
1.1 Commercial Products Available

1.1.1 Surveillance Cameras

A typical surveillance camera contains an image sensor that generates a spatial, two-dimensional electronic signal from a physical measurement. Most of these sensors operate in the visible spectrum, but recently other sensor types have become commercially available (e.g. thermal cameras). The objective of the camera contains the lens elements, which capture the photons and focus them on the image sensor. Moreover, depending on the build-up of the objective, the camera can also provide optical zooming functionality. Most surveillance cameras offer both day and night functionality.

During the day, when the illuminance level is high, the infrared light is filtered out by an IR-cut filter to prevent unwanted color distortions. In night mode, when the illuminance level is low, the IR-cut filter is removed to make use of near-infrared light and deliver black-and-white images.
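This day/night switching can be thought of as a threshold decision on the measured scene illuminance, usually with some hysteresis so that the mechanical filter does not oscillate around a single threshold. The following short Python sketch illustrates this logic; the lux thresholds and the set_ir_cut_filter callback are illustrative assumptions, not the interface of any real camera.

# Illustrative day/night switching with hysteresis (assumed thresholds and API).
DAY_THRESHOLD_LUX = 10.0    # switch back to day mode above this illuminance
NIGHT_THRESHOLD_LUX = 2.0   # switch to night mode below this illuminance

class DayNightController:
    def __init__(self, set_ir_cut_filter):
        self.set_ir_cut_filter = set_ir_cut_filter   # callback: True = filter engaged
        self.day_mode = True
        self.set_ir_cut_filter(True)

    def update(self, illuminance_lux):
        if self.day_mode and illuminance_lux < NIGHT_THRESHOLD_LUX:
            self.day_mode = False
            self.set_ir_cut_filter(False)    # remove IR-cut filter: black-and-white near-IR imaging
        elif not self.day_mode and illuminance_lux > DAY_THRESHOLD_LUX:
            self.day_mode = True
            self.set_ir_cut_filter(True)     # engage IR-cut filter: undistorted colors

if __name__ == "__main__":
    controller = DayNightController(lambda engaged: print("IR-cut filter engaged:", engaged))
    for lux in (50.0, 8.0, 1.5, 5.0, 20.0):
        controller.update(lux)

With the two thresholds separated, a brief fluctuation of the illuminance around dusk does not cause the filter to be switched back and forth repeatedly.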

The objective is attached to the camera body, which contains the image sensor and other elements.

The body can also be attached to a base, where a small motor is used to orient and position the camera. The analog and digital input/output and external synchronization interfaces are also integrated into the body.

1.1.1.1 Image Sensors

The objective of the camera focuses the light on the camera's sensor. The image sensor consists of picture elements (pixels). From the input optical image the sensor's pixels produce an electronic signal. The two most commonly used technologies for image sensors are the CCD (charge-coupled device) and the CMOS (complementary metal-oxide semiconductor). In the case of CCD sensors the charges of the pixels are transported line by line through the sensor into a charge amplifier, which converts the charge into voltage; this voltage is the analog output electronic signal. Then an analog-to-digital converter turns each pixel's value into a digital number. In the case of CMOS, each pixel has an integrated complementary transistor circuit which amplifies the signal and assigns a digital value to it, i.e. each pixel can be accessed individually. This technology enables the integration of complex circuitry on the sensor (e.g. amplifiers, noise-correction, or digitization circuits). If such camera functions are implemented, each pixel can be optimized regarding e.g. brightness, range of contrast and white balance. In most single-sensor cameras a color filter array (CFA) is placed in front of the sensor's pixels to capture color information. The CFA is a mosaic of color filters arranged in a regular pattern (e.g. the Bayer filter).
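To illustrate the effect of the CFA, the following Python sketch samples a full-color image with the common RGGB Bayer layout, so that each pixel keeps only the single color component selected by its filter; the RGGB arrangement is an assumption, as the exact pattern varies between sensors.

import numpy as np

def bayer_mosaic_rggb(rgb):
    """Simulate an RGGB Bayer color filter array on an H x W x 3 image."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]   # red on even rows, even columns
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]   # green on even rows, odd columns
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]   # green on odd rows, even columns
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]   # blue on odd rows, odd columns
    return mosaic

if __name__ == "__main__":
    img = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
    print(bayer_mosaic_rggb(img))

Recovering a full-color image from such a mosaic requires demosaicing, i.e. interpolating the two missing color components at every pixel.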

The list below summarizes the practical problems of image sensors that should be considered from the image processing point of view:

• Due to the complex architecture, the light sensitivity of a CMOS chip tends to be lower than that of a CCD sensor.

• Compared to traditional CMOS chips, the manufacturing process of CCD sensors yields higher-quality chips in terms of light sensitivity, and CCD cameras produce somewhat less noisy images than CMOS-based cameras.

• Smear effect: When a bright light source is present in the field of view, a vertical white stripe appears in the image. This effect can only appear when a CCD sensor is used. The problem originates from the read-out process of the sensor: the lines are shifted across the sensor towards the bottom, therefore the charges passing the bright light source are raised. In the case of CMOS sensors the charge is read out at the pixel itself, hence this effect cannot appear.

• Blooming effect: In the case of CCD sensors there is a limit on how much charge a pixel can store. Too much charge can cause an overflow to the neighboring pixels, which is called blooming. The overflow can then spread further to the neighboring elements.

• Quantization error: The sensed image is an analog signal, and an analog-to-digital converter is used to map the analog values to discrete levels. The information loss during this conversion is called quantization error (a small numerical sketch follows this list).

• Aliasing error: This type of error occurs when the sampling frequency is less than twice the highest frequency of the signal. In digital imaging this effect becomes visible in patterns containing high spatial frequencies. In most cameras anti-aliasing filters are placed in front of the sensor to eliminate this error.

• Thermal noise: The thermal agitation of electrons in a conductor generates electronic noise, which is called thermal noise.

• Hot pixels: Almost all CCD and CMOS chips contain sensor elements with an abnormally high rate of charge leakage. As a result, small bright points will appear in the image.
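To make the quantization error mentioned above concrete, the following small Python sketch quantizes a signal in the range [0, 1] to a given bit depth and reports the maximum error, which stays below half of one quantization step; the chosen bit depths are only examples.

import numpy as np

def quantize(signal, bits, full_scale=1.0):
    """Uniformly quantize values in [0, full_scale] to 2**bits discrete levels."""
    step = full_scale / (2 ** bits - 1)
    return np.round(signal / step) * step

if __name__ == "__main__":
    analog = np.random.rand(100000)   # stand-in for analog pixel values in [0, 1]
    for bits in (4, 8, 12):
        digital = quantize(analog, bits)
        error = np.abs(analog - digital)
        print(bits, "bits: max quantization error =", error.max())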

1.1.1.2 Image Scanning Techniques

For today's cameras, two techniques are available for reading out the information from the image sensor. Interlaced scanning, mainly used in CCD sensors, produces the image from two separate fields: the first contains the odd, the second the even horizontal lines. The sensor updates each field of a video frame separately: it first updates the odd field, and in the next cycle the even field. Therefore, the full frame is not available until after two cycles. In the case of progressive scanning, the sensor scans all the lines of an image in order, producing a full frame at every time step.

When an interlaced video is displayed on progressive scan monitors (e.g. modern CRT or LCD computer monitors), which scan the lines of an image consecutively, the delay between the even and odd lines causes visible artifacts, for example when a fast-moving target is present. Moreover, a second artifact can be noticed when the video frame contains patterns with high vertical frequency (e.g. a person's clothing with stripes) that approach the horizontal resolution of the video format. This artifact is called interline twitter, and such patterns appear to twitter when displayed.
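The following Python sketch illustrates the relation between fields and frames: it splits a frame into its even and odd fields and reassembles it by simple line weaving, which is the most basic way to show interlaced material on a progressive display. The motion artifact described above appears exactly because, with a moving target, the two woven fields were captured at different instants.

import numpy as np

def split_fields(frame):
    """Separate a frame into its even-line and odd-line fields."""
    return frame[0::2], frame[1::2]

def weave(even_field, odd_field):
    """Reassemble a full frame by interleaving the two fields line by line."""
    height = even_field.shape[0] + odd_field.shape[0]
    frame = np.empty((height,) + even_field.shape[1:], dtype=even_field.dtype)
    frame[0::2] = even_field
    frame[1::2] = odd_field
    return frame

if __name__ == "__main__":
    frame = np.arange(6 * 4).reshape(6, 4)
    even, odd = split_fields(frame)
    assert np.array_equal(weave(even, odd), frame)   # static content weaves back perfectly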

1.1.1.3 Active Infrared Technology

Ordinary CCD cameras can be combined with infrared illumination of the target to obtain an effective night-time imaging device which operates under low illuminance levels. Near-infrared light lies within the 700-1000 nm wavelength range, and although it is not visible to the human eye, it can be detected by most image sensors. Most active infrared cameras produce a monochrome image as output. A camera that operates in both day and night conditions typically includes an IR-cut filter, which filters out the IR light in day mode so that the colors do not get distorted.


1.1.1.4 Thermal Imaging

The main disadvantage of active infrared technology is the limited operating distance, which usually does not exceed 150 meters. Thermal imaging devices measure thermal radiation, therefore they require no active illumination. Unlike active infrared devices, thermal cameras can have a range of up to 500 meters, which makes them suitable for outdoor surveillance tasks. On the other hand, the resolution of thermal sensors is usually lower than that of ordinary image sensors.

Thermal devices are mainly used in military applications, but the price of these devices has started to drop recently, and they have become commercially available.

1.1.1.5 Lenses

The main function of the lens elements in the camera is to focus the light on the image sensor. Besides that, the lens assembly is responsible for the correct exposure by controlling the amount of light reaching the sensor. Finally, the angle of view is also defined by the lens assembly (a small sketch relating focal length and angle of view follows the list below). According to the angle of view, lenses can be classified into the following classes:

• Normal lenses (25°-50°): same angle of view as that of the human eye;

• Wide-angle lenses (60°-100°): large angle of view with good low-light performance, provides less detail than the normal lens;

• Fisheye lenses (up to 180°): extremely large angle of view;

• Telephoto lenses (10°-15°): narrower angle of view, provides fine details and is used when the object is located far from the camera;

• Omnidirectional vision: captured images usually depict the full 360° angle of view (horizontal, vertical or hemispherical).
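For a rectilinear lens, the angle of view mentioned above can be estimated from the focal length and the sensor dimension as AOV = 2 * arctan(d / (2 f)). The short Python sketch below evaluates this formula; the assumed sensor width of 6.4 mm corresponds roughly to a 1/2-inch sensor and the focal lengths are example values.

import math

def angle_of_view_deg(sensor_size_mm, focal_length_mm):
    """Angle of view of a rectilinear lens: 2 * arctan(d / (2 f))."""
    return math.degrees(2.0 * math.atan(sensor_size_mm / (2.0 * focal_length_mm)))

if __name__ == "__main__":
    sensor_width = 6.4   # mm, assumed width of a 1/2-inch sensor
    for f in (3.0, 8.0, 25.0):
        print("f =", f, "mm -> horizontal angle of view =",
              round(angle_of_view_deg(sensor_width, f), 1), "degrees")

Short focal lengths thus yield wide-angle behavior, while long focal lengths give the narrow angle of view of telephoto lenses.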

Using lenses in cameras will introduce some degree of aberration in the process of image formation. Therefore manufacturers design their products with care to minimize these aberrations. The list below summarizes the typical aberrations caused by lenses.

• Spherical aberration: Lenses used in cameras usually have a spherical shape, which is relatively easy to manufacture. However, light beams parallel to the optic axis of the lens but at different distances from the axis are focused at different places. For beams distant from the axis this effect can be significant, and the result is a blurry image. In high quality cameras aspheric lenses are usually used to compensate for spherical aberration.


• Comatic aberration (coma): This aberration occurs when an off-axis object is imaged. Beams passing through the outer margins of the lens are focused at a different position than the beams passing through the center, and produce ring-shaped patterns, also known as comatic circles. These circles form a comet-like shape.

• Chromatic aberration: Lenses focus different wavelengths of light to different distances. This results in colored fringes in the image, which are most noticeable along sharp edges and boundaries.

• Barrel distortion: This distortion is caused by the spherical shape of the lens and deforms the whole image. The further we go from the axis, the less magnification is achieved. Straight lines are rendered as curves on the image sensor, and the resulting image appears as if it were mapped around a barrel.

• Pincushion distortion: This aberration is the opposite of the barrel distortion.

• Vignetting: Vignetting is the reduction of brightness at the corners compared to the center of an image. This is usually caused by the physical size of multiple lens elements.

• Astigmatism: Rays from a point far away from the axis are not focused into one point, but form two focal lines. If an optical system with astigmatism is used to form an image of a cross, the vertical and horizontal lines will be in sharp focus at two different distances.

Professional camera and lens manufacturers provide complex lens assemblies for correcting the various aberrations. Moreover, commercial software tools are also available for correcting different geometrical distortions.
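Such tools commonly describe barrel and pincushion distortion with a polynomial radial model, as used in camera calibration (e.g. the Brown-Conrady style k1, k2 coefficients). The Python sketch below applies this model to normalized image coordinates; the coefficient value is chosen only for illustration, and the sign convention depends on whether the model maps ideal to distorted coordinates or the reverse.

import numpy as np

def radial_distort(xy, k1, k2=0.0):
    """Apply the polynomial radial distortion model x' = x (1 + k1 r^2 + k2 r^4)."""
    r2 = np.sum(xy ** 2, axis=-1, keepdims=True)
    return xy * (1.0 + k1 * r2 + k2 * r2 ** 2)

if __name__ == "__main__":
    # Points along a straight horizontal edge in the normalized image plane.
    points = np.stack([np.linspace(-1.0, 1.0, 5), np.full(5, 0.75)], axis=-1)
    print(radial_distort(points, k1=-0.2))   # the edge bows toward the center (barrel-like)

Correction tools estimate such coefficients for a given lens and then invert the mapping to straighten the curved lines.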

1.1.1.6 Camera Types

Surveillance cameras can be categorized into the following classes.

Fixed camera: The fixed camera has a lens with a fixed angle of view. Therefore, this category has the simplest mounting, and usually a single cable is enough for access. When used in outdoor environments, fixed cameras can be installed in protective housings.

Pan-Tilt-Zoom (PTZ) camera: A PTZ camera can be controlled (pan, tilt and zoom) manually by the operator or automatically by external devices. There are two possibilities for sending the control commands: over the cable used for video transmission (used for digital PTZ cameras), or over a separate serial RS-232/422/485 wire (used for analog PTZ cameras). When the zoom factor is high, outdoor PTZ cameras are sensitive to the vibrations caused by weather conditions or by