Error Analysis of Attitude Estimation with Focal-Plane Processors for Guidance for Mobile Robots

Tamas Zsedrovits, Akos Zarandy†*, Peter Bauer*, Balint Vanek*, Jozsef Bokor*, Tamas Roska

Pazmany Peter Catholic University, Faculty of Information Technology and Bionics, Budapest, Hungary
*Institute for Computer Science and Control of the Hungarian Academy of Sciences (MTA SZTAKI), Budapest, Hungary
Abstract— In this paper, the results of an error analysis of four different feature-point-based attitude estimator algorithms are presented. The algorithms were tested in simulation with realistic flight paths and camera models.

With these results, the best-performing candidate algorithm can be chosen for a given focal-plane processor and the given scenarios.

Index Terms—Attitude estimation, UAS, Focal-plane processor, Visual Navigation

I. INTRODUCTION

For small mobile robots, especially for Unmanned Aircraft Systems (UAS), visual navigation (VisNAV) can be a good alternative to inertial guidance systems [1]. Furthermore, VisNAV measurements can enhance redundancy in the navigation system or improve the accuracy of the attitude estimates (depending on the sensor capabilities). As we showed in [2] and [3], feature-point-based visual attitude estimation can solve the drift problem caused by slow global navigation (GNSS) measurements fused with the inertial navigation (INS).

Additionally, in situations where the GPS signal is lost, the visual information fused with the INS gives better results than the INS alone [4]. One of the main drawbacks of VisNAV is the high computational demand of image and video processing.

This problem can be addressed with focal-plane processors, as they are capable of processing images at high speed and with low energy consumption [5]. The limited image resolution is the main drawback of these systems. In this paper, an error analysis of four different attitude estimator algorithms is presented, carried out in simulation at the image resolutions of current focal-plane processors. The results show that with the best-performing algorithm the mean error is around 1 pixel, so it can be used as an auxiliary navigation system.

II. METHODS

Provided that the image feature points are detected and paired for consecutive images, these point pairs can be used to estimate the camera rotation, that is, the attitude change between the two frames. The details of the measurements cannot be given here because of the page restriction, but they can be found in [2].
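To make this pipeline concrete, the sketch below shows one common way to recover the inter-frame rotation from matched point pairs; it uses OpenCV's five-point essential matrix solver and is only an illustration under assumed interfaces, not the implementation from [2].

```python
# Sketch: relative rotation from matched feature points (illustrative only;
# not the authors' code). Uses OpenCV's five-point solver.
import numpy as np
import cv2

def relative_rotation(pts1, pts2, K):
    """Estimate the rotation between two frames from matched points.

    pts1, pts2 : (N, 2) arrays of pixel coordinates in frames k and k+1
    K          : (3, 3) camera intrinsic matrix
    Returns the 3x3 direction cosine matrix (DCM) of the camera rotation.
    """
    # Five-point algorithm with RANSAC outlier rejection.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    # Decompose E and keep the (R, t) pair that places points in front of
    # both cameras (cheirality check done inside recoverPose).
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R
```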


A. Feature point pair generation

For the simulations, a realistic flight path with a sinusoidal shape is used (Figure 1), which was generated in our hardware-in-the-loop simulator. The feature points are placed randomly near the ground around the flight path and are then projected onto the image plane.
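A minimal sketch of this data generation step is given below; the point density, scatter ranges, and the `project_points` helper are illustrative assumptions, not the simulator's actual parameters.

```python
# Sketch of the test-data generation: random near-ground 3D points projected
# onto the image plane of a pinhole camera. All numeric ranges below are
# illustrative assumptions, not the paper's values.
import numpy as np

def project_points(X_world, R, t, K):
    """Project (N, 3) world points given camera rotation R, translation t,
    and intrinsics K; returns (N, 2) pixel coordinates."""
    X_cam = (R @ X_world.T + t.reshape(3, 1)).T   # world -> camera frame
    x = X_cam / X_cam[:, 2:3]                      # perspective division
    return (K @ x.T).T[:, :2]                      # apply intrinsics

rng = np.random.default_rng(0)
# Feature points scattered randomly near the ground around the flight path
# (NED frame: the third coordinate is "down", so near-zero altitude).
points = np.column_stack([rng.uniform(-500, 500, 1000),   # north [m]
                          rng.uniform(-500, 500, 1000),   # east  [m]
                          rng.uniform(-2, 2, 1000)])      # down  [m]
```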

Figure 1. Sinusoidal path in the NED frame

For the tests, two focal-plane processors were selected: the SCAMP [6] and the Eye-RIS [7] system. Both were tested with two lenses, with 60° and 30° fields of view. In each configuration, the tests were run with 0.02–0.08 s image sampling times. The camera resolution is also of interest, because the effect of pixelization and spatial resolution is studied. A projective camera can be characterized by the angular resolution of its central pixel (CPAR), which is defined as follows:

$$\mathrm{CPAR} = \tan^{-1}\left(\frac{1}{f}\right)$$

where $f$ is the focal length of the camera in pixels. With this measure, cameras with different resolutions and fields of view can be compared.
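As a quick sanity check, CPAR can be computed from the sensor width and the lens field of view under a simple pinhole assumption (a sketch; the calibration behind the table below may differ slightly, so the values agree only approximately):

```python
import math

def cpar_deg(width_px, fov_deg):
    """Angular resolution of the central pixel, assuming the focal length
    follows from the horizontal field of view of a pinhole camera."""
    f = (width_px / 2) / math.tan(math.radians(fov_deg) / 2)  # focal length [px]
    return math.degrees(math.atan(1.0 / f))

# The paper's table reports 0.397 and 0.546 °/px; the small discrepancy
# suggests a slightly different focal-length calibration was used there.
print(cpar_deg(176, 60))  # Eye-RIS, 60° lens -> ~0.38 °/px
print(cpar_deg(128, 60))  # SCAMP,   60° lens -> ~0.52 °/px
```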

Camera  | FOV [°] | Resolution [px] | CPAR [°/px]
--------|---------|-----------------|------------
Eye-RIS | 60      | 176×144         | 0.397
Eye-RIS | 30      | 176×144         | 0.199
SCAMP   | 60      | 128×128         | 0.546
SCAMP   | 30      | 128×128         | 0.273

B. Attitude estimator algorithms

Four algorithms are tested in the scenarios described in the previous section: the five-point algorithm, the eight-point algorithm, a linear homography-based algorithm, and a RANSAC variant called MLESAC. Of these four, the five-point, eight-point, and MLESAC algorithms are based on epipolar geometry, and MLESAC is an iterative estimator.
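For orientation, the sketch below maps these four estimator families onto widely available OpenCV calls. Note that OpenCV ships no MLESAC, so its LMedS-based robust estimation stands in for the iterative robust variant here; none of this is the authors' implementation.

```python
# Rough correspondence between the four tested estimator families and common
# library calls (a sketch under assumed substitutions, not the paper's code).
import cv2

def estimate_rotation(pts1, pts2, K, method="five_point"):
    if method == "five_point":
        E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    elif method == "eight_point":
        F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_8POINT)
        E = K.T @ F @ K                       # essential from fundamental
    elif method == "robust":                  # MLESAC-like robust estimation
        E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.LMEDS)
    elif method == "homography":              # linear homography-based
        H, mask = cv2.findHomography(pts1, pts2, cv2.RANSAC)
        n, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
        return Rs[0]   # one of up to 4 candidates; disambiguation needs more cues
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R
```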


C. Error measures

In each step, the direction cosine matrix (DCM) between the two frames is extracted, which describes the rotation from one camera orientation to the other. From this DCM the Euler angles are calculated and compared to the ground truth. To characterize the performance of each algorithm, the absolute errors of the three Euler angles are used:

$$e_i = \sqrt{(\alpha_i - \alpha_i^m)^2}$$

where $\alpha_i$ is the ground truth angle for the $i$th frame (roll, pitch, or yaw) and $\alpha_i^m$ is the calculated angle. Additionally, for each run the mean, the median, and the corrected standard deviation of the absolute error are calculated.
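A minimal sketch of these error measures follows; the ZYX (yaw-pitch-roll) Euler convention is an assumption, since the paper does not restate its convention in this short format.

```python
# Sketch of the error measures: Euler angles extracted from the estimated DCM
# are compared with the ground truth (ZYX convention assumed here).
import numpy as np

def dcm_to_euler_zyx(R):
    """Yaw, pitch, roll [rad] from a 3x3 NED-to-body direction cosine matrix."""
    yaw = np.arctan2(R[0, 1], R[0, 0])
    pitch = -np.arcsin(np.clip(R[0, 2], -1.0, 1.0))
    roll = np.arctan2(R[1, 2], R[2, 2])
    return yaw, pitch, roll

def error_stats(est_angles, true_angles):
    """Per-frame absolute error and its mean / median / corrected std."""
    e = np.abs(np.asarray(est_angles) - np.asarray(true_angles))
    return e.mean(), np.median(e), e.std(ddof=1)   # ddof=1: corrected std
```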

III. RESULTS

The simulation results show that both focal-plane processors can be used for attitude estimation. In the following, only the results of the yaw angle calculation are shown; the results for pitch and roll are similar, and the trends seen here can be observed in the other test cases as well.

Figure 2. The mean of the absolute error of yaw angle with the five point algorithm

Figure 2 shows the effect of the different spatial-temporal resolutions for the five-point algorithm. The results of the Eye-RIS are shown in black and blue, and the results of the SCAMP in green and red. Because of the low resolution of these focal-plane processors, the effect of the temporal change is almost negligible.

Figure 3. Yaw absolute error for Eye-RIS with the 30° lens

Figure 3 and Figure 4 show the effect of the CPAR change on the four different algorithms. Although the 30° lens in Figure 3 has the smaller CPAR, which means a finer angular resolution, the error is larger with it. The reason is that with the smaller field of view, fewer feature point pairs can be extracted. In particular, in Figure 3, in the time frames between 250 and 400, where the aircraft was performing a turn, the eight-point algorithm and MLESAC could not give any results because of the small number of feature points.

Figure 4. Yaw absolute error for Eye-RIS with the 60° lens

IV. ACKNOWLEDGMENT

This research project was supported by the University of National Excellence Program and the Research Faculty grant awarded to the Pazmany Peter Catholic University, Faculty of Information Technology and Bionics.

REFERENCES

[1] Y. Diskin, B. Nair, A. Braun, S. Duning, and V. K. Asari, "Vision-based navigation system for obstacle avoidance in complex environments," 2013 IEEE Applied Imagery Pattern Recognition Workshop (AIPR), pp. 1–8, Oct. 2013.

[2] T. Zsedrovits, P. Bauer, A. Zarandy, and B. Vanek, "Error Analysis of Algorithms for Camera Rotation Calculation in GPS/IMU/Camera Fusion for UAV Sense and Avoid Systems," International Conference on Unmanned Aircraft Systems, 2014 (submitted).

[3] T. Zsedrovits, P. Bauer, A. Zarandy, and B. Vanek, "Towards Real-Time Visual and IMU Data Fusion," AIAA Guidance, Navigation, and Control Conference and Exhibit, 2014.

[4] T. Chu, N. Guo, S. Backén, and D. Akos, "Monocular camera/IMU/GNSS integration for ground vehicle navigation in challenging GNSS environments," Sensors (Basel, Switzerland), vol. 12, no. 3, pp. 3162–3185, Jan. 2012.

[5] A. Zarandy, B. Pencz, M. Nemeth, and T. Zsedrovits, "Implementation of visual navigation algorithms on the Eye-RIS 1.3 system," 14th International Workshop on Cellular Nanoscale Networks and their Applications (CNNA), 2014 (submitted).

[6] P. Dudek and S. J. Carey, "A general-purpose 128×128 SIMD processor array with integrated image sensor," Electronics Letters, vol. 42, no. 12, pp. 678–679, June 2006.

[7] Á. Rodríguez-Vázquez, "The Eye-RIS CMOS vision system," Analog Circuit Design, pp. 15–32, 2008.
