Localization estimation based on Extended Kalman filter using multiple sensors
Date
11/10/2013
Author
Cáceres Hernández, Danilo
Hoang, Van-Dung
Jo, Kang-Hyun
Le, My-Ha
Abstract:
This paper describes a method for localization estimation based on the Extended Kalman filter using an omnidirectional camera and a laser rangefinder. The laser rangefinder information is used to predict the absolute motion of the vehicle. The geometric constraint between sequential pairs of omnidirectional images is used to correct the error and construct the map. The advantage of the omnidirectional camera is its large field of view, which is helpful for tracking feature landmarks over long distances. For vision-based motion estimation, the absolute translation of the vehicle is approximated from the posterior information of the previous step. Structure from motion based on bearing and range sensors can yield a corrected local position over short movements, but errors accumulate over time. To utilize the advantages of the two sensors, an Extended Kalman filter framework is applied to integrate the multiple sensors for localization estimation. The experiments were carried out using an electric vehicle with the omnidirectional camera mounted on the roof and the laser device mounted on the bumper. The simulation results demonstrate the effectiveness of this method on large field-of-view scene images of an outdoor environment.
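The abstract describes an EKF cycle in which laser-based odometry drives the prediction step and omnidirectional-vision observations drive the correction step. The sketch below is a minimal illustration of that predict/correct structure, not the authors' implementation: it assumes a 2D pose state [x, y, θ], laser odometry given as a relative motion (dx, dy, dθ) in the vehicle frame, and a bearing measurement to a known landmark from the omnidirectional camera; the function names, noise values, and measurement model are illustrative assumptions.

```python
import numpy as np

def ekf_predict(x, P, u, Q):
    """Prediction: propagate the pose with laser-odometry motion (dx, dy, dtheta)
    expressed in the vehicle frame, then linearize the motion model."""
    dx, dy, dth = u
    c, s = np.cos(x[2]), np.sin(x[2])
    x_pred = np.array([x[0] + c * dx - s * dy,
                       x[1] + s * dx + c * dy,
                       x[2] + dth])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -s * dx - c * dy],
                  [0.0, 1.0,  c * dx - s * dy],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update_bearing(x, P, z, landmark, R):
    """Correction: bearing (rad) to a known landmark observed by the
    omnidirectional camera, measured relative to the vehicle heading."""
    lx, ly = landmark
    dx, dy = lx - x[0], ly - x[1]
    q = dx * dx + dy * dy
    z_hat = np.arctan2(dy, dx) - x[2]              # expected bearing
    H = np.array([[dy / q, -dx / q, -1.0]])        # Jacobian of the measurement model
    y = np.array([np.arctan2(np.sin(z - z_hat),    # innovation, wrapped to [-pi, pi]
                             np.cos(z - z_hat))])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    x_new = x + (K @ y).ravel()
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# Example: one predict/correct cycle with assumed noise parameters
x = np.zeros(3)                    # pose [x, y, theta]
P = np.eye(3) * 0.1
Q = np.diag([0.05, 0.05, 0.01])    # laser-odometry noise (assumed)
R = np.array([[0.02]])             # bearing-measurement noise (assumed)
x, P = ekf_predict(x, P, u=(1.0, 0.0, 0.05), Q=Q)
x, P = ekf_update_bearing(x, P, z=0.70, landmark=(5.0, 3.0), R=R)
```

In this arrangement the laser odometry keeps the pose estimate current between image frames, while each bearing observation from the wide field-of-view camera pulls the estimate back toward the landmark geometry, which is what limits the accumulated drift mentioned in the abstract.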