Dual Mouse-SPIM

specimen with the microscope, where the beads can act as a reporter of the point spread function of the microscope. The alignment can be performed “live” by inspecting the bead images for aberrations. By gradually changing the correction collar, the ring artifacts on out-of-focus beads are minimized, and the peak intensity of in-focus beads is maximized. Moving the correction collar also slightly shifts the focal plane, which has to be compensated by shifting the light-sheet correspondingly to coincide with the correct imaging plane.
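The peak-intensity criterion used during the live adjustment can be expressed as a simple image metric. The sketch below is illustrative only: the function names and the background-subtraction scheme are our own assumptions, not part of the microscope software.

```python
import numpy as np

def bead_peak_intensity(image: np.ndarray) -> float:
    """Peak intensity of the brightest bead above the median background.

    A simple live focus metric: as the correction collar approaches its
    optimal setting, the in-focus bead peak grows higher.
    """
    return float(image.max() - np.median(image))

def synthetic_bead(sigma: float, size: int = 64, total: float = 1e4) -> np.ndarray:
    """Synthetic bead image: a 2-D Gaussian spot with fixed total signal."""
    y, x = np.mgrid[:size, :size] - size // 2
    psf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return total * psf / psf.sum()

# A sharper spot concentrates the same signal into fewer pixels, so the
# metric increases as aberrations are reduced.
sharp = bead_peak_intensity(synthetic_bead(sigma=1.5))
blurred = bead_peak_intensity(synthetic_bead(sigma=3.0))
assert sharp > blurred
```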

Adjusting the field of view

To allow for proper sampling of the image, we use 50× magnification, which, combined with the 6.5 µm pixel pitch of our sCMOS camera, results in a 0.13 µm pixel size. The full field of view recorded by the camera is 2048 × 0.13 µm = 266.24 µm. To ensure the best image quality, we align the center of the objective's field of view on the camera sensor, since this region has the best optical properties in terms of numerical aperture, aberration correction, and flatness of field.
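The sampling numbers above follow directly from the magnification and the sensor geometry; a quick worked check (all values taken from the text):

```python
# Pixel size in sample space and the resulting field of view.
magnification = 50.0      # detection magnification
pixel_pitch_um = 6.5      # sCMOS sensor pixel pitch in micrometers
sensor_pixels = 2048      # pixels per sensor side

pixel_size_um = pixel_pitch_um / magnification      # 0.13 µm per pixel
field_of_view_um = sensor_pixels * pixel_size_um    # 266.24 µm

assert abs(pixel_size_um - 0.13) < 1e-9
assert abs(field_of_view_um - 266.24) < 1e-9
```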

Field of view alignment can be performed by using mirror M5 just before the detection merging unit. To identify the center region of the field of view, diffuse white light is used to illuminate the entire sample chamber, and it is imaged on the camera. Then, mirror M5 is adjusted until the top edge of the field of view becomes visible, i.e., where the illumination from the chamber is clipped. This will have a circular shape. Afterwards, adjusting the mirror in the orthogonal direction, the left-right position of the field of view can be adjusted, by centering the visible arc on the camera sensor.

After the horizontal direction is centered, vertical centering is performed. This, however, cannot be done the same way as the horizontal direction, since that would require misaligning the already aligned horizontal position. To determine the center, we move the field of view from the topmost position to the bottom. During this process, the number of turns of the adjustment screw is counted (this can be done accurately by using a hex key). After reaching the far end of the field of view, the mirror movement is reversed, and the screw is turned back halfway to reach the middle.

2.4 Control unit

The microscope’s control and automation are performed by an in-house designed modular microscope control system developed in LabVIEW [96]. The core of the system is a National Instruments cRIO-9068 embedded system that features an ARM Cortex-A9 processor and a Xilinx Zynq-7020 FPGA. Having both chips in the same device is a great advantage: the main processor can run most of the microscope control software, while the FPGA can generate the necessary output signals in real time and with high precision.

The embedded system is complemented by a high-performance workstation that is used to display the user interface of the microscope, and to record the images of the high-speed sCMOS camera.

2.4.1 Hardware

Various components need to be synchronized with high precision to operate the microscope: a laser combiner to illuminate the sample; a galvanometric scanner to generate the light-sheet; stages to move the samples; a filter wheel to select the imaging wavelengths; and a camera to detect the fluorescence signal. For high-speed image acquisition, all of these devices have to be synchronized precisely, in the millisecond range, and some even in the microsecond range. Although they require different signals to control them, we can split them into three main categories:

digital input: camera exposure; laser on/off (×3)
analog input: galvanometric position; laser intensity (×3)
serial communication: filter wheel; stages (×2)

All devices are connected to the NI cRIO-9068 embedded system, either to the built-in RS-232 serial port, or to the digital and analog outputs implemented by C-series expansion modules (NI 9401, NI 9263, NI 9264). The workstation running the user interface communicates with the embedded system over the network. The only device connected to both systems is the camera: the embedded system triggers the image acquisition, and the images are piped to the workstation through a dual CameraLink interface, capable of a sustained 800 MB/s data transfer rate (Figure 2.7).
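The 800 MB/s sustained rate bounds the achievable full-frame rate. A back-of-the-envelope check, assuming a 16-bit pixel readout (the bit depth is our assumption; the sensor size and link rate come from the text):

```python
# Frame-rate ceiling imposed by the dual CameraLink connection.
sensor_pixels = 2048 * 2048           # full frame
bytes_per_pixel = 2                   # assumed 16-bit readout
frame_bytes = sensor_pixels * bytes_per_pixel   # 8 MiB per frame
link_rate = 800e6                     # 800 MB/s sustained transfer

max_fps = link_rate / frame_bytes
assert 90 < max_fps < 100             # roughly 95 full frames per second
```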

2.4.2 Software

Precise control of all the instruments relies not only on the selected hardware, but just as much on the software. Our custom software is developed in LabVIEW, using an object-oriented approach with the Actor Framework. The embedded system is responsible for the low-level hardware control, for keeping track of the state of all devices, for saving the user configurations, and for automating and scheduling the experiments. It also offers a JavaScript Object Notation (JSON) based application programming interface (API) through WebSocket communication. This is mainly used to communicate with the user interface, but it also allows automated control by external software.
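An external client would exchange JSON messages of this kind over the WebSocket connection. The exact message schema of the microscope software is not documented here, so the field names below ("command", "device", "value") are illustrative assumptions only:

```python
import json

# Hypothetical command in the style of a JSON-over-WebSocket API;
# device and field names are invented for illustration.
command = {
    "command": "set",
    "device": "laser_488",
    "value": {"intensity_percent": 12.5, "enabled": True},
}

payload = json.dumps(command)   # what a client would send over the WebSocket
echoed = json.loads(payload)    # and what the embedded system would decode

assert echoed == command        # JSON round-trip preserves the message
```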

FPGA software

The on-board FPGA is responsible for generating the digital and analog output signals based on the microscope settings (Figure 2.8). To avoid having to calculate all the traces

[Figure 2.7 diagram; component labels: FW, stage, laser, galvo, n × camera, network; link rates: 10 Gbit/s (×2), 1 Gbit/s]

Figure 2.7: Microscope control hardware and software architecture. The embedded system responsible for the hardware control and the workstation communicate through the network using the WebSocket protocol. The electronic devices of the microscope are controlled either through digital/analog signals or through serial communication. The camera is also connected to the workstation with a CameraLink connection, which transmits the recorded images.

Figure 2.8: Digital and analog control signals. Example traces for recording a single plane with 50 ms exposure time and 12 ms delay. During the exposure of the camera, the scanner sweep has three sections: acceleration (acc.), linear, and deceleration (dec.). The laser line is synchronized with the linear section of the scanner to ensure even illumination.
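The timing relationships in Figure 2.8 can be sketched numerically. The 50 ms exposure and 12 ms delay come from the caption; the 5 ms acceleration/deceleration durations and the sample rate are illustrative assumptions:

```python
import numpy as np

# Control traces for one plane: a 12 ms delay, then a 50 ms camera
# exposure during which the galvo sweeps (acc., linear, dec.) and the
# laser is gated to the linear section only.
fs = 10_000                    # samples per second (0.1 ms resolution), assumed
delay_ms, exposure_ms = 12, 50
acc_ms = dec_ms = 5            # assumed section durations
linear_ms = exposure_ms - acc_ms - dec_ms

def ms(n):
    """Milliseconds to sample count."""
    return int(n * fs / 1000)

total = ms(delay_ms + exposure_ms + 10)

camera = np.zeros(total, dtype=bool)
camera[ms(delay_ms):ms(delay_ms + exposure_ms)] = True

laser = np.zeros(total, dtype=bool)
laser[ms(delay_ms + acc_ms):ms(delay_ms + acc_ms + linear_ms)] = True

# Galvo position: linear ramp during the linear section; the smooth
# acceleration/deceleration segments are left out of this sketch.
galvo = np.zeros(total)
lin = slice(ms(delay_ms + acc_ms), ms(delay_ms + acc_ms + linear_ms))
galvo[lin] = np.linspace(-1.0, 1.0, lin.stop - lin.start)

# The laser is only ever on while the camera is exposing.
assert np.all(camera[laser])
```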