dc.contributor.author | Cáceres Hernández, Danilo | |
dc.contributor.author | Hoang, Van-Dung | |
dc.contributor.author | Jo, Kang-Hyun | |
dc.contributor.author | Le, My-Ha | |
dc.date.accessioned | 2018-06-28T20:59:23Z | |
dc.date.available | 2018-06-28T20:59:23Z | |
dc.date.issued | 2013-11-10 | |
dc.identifier | https://ieeexplore.ieee.org/abstract/document/6700032/ | |
dc.identifier.issn | 1553-572X | |
dc.identifier.uri | http://ridda2.utp.ac.pa/handle/123456789/5086 | |
dc.description.abstract | This paper describes a method for localization estimation based on an Extended Kalman filter using an omnidirectional camera and a laser rangefinder. Laser rangefinder information is used to predict the absolute motion of the vehicle, while the geometric constraints of sequential pairwise omnidirectional images are used to correct the error and construct the map. The advantage of an omnidirectional camera is its large field of view, which is helpful for tracking feature landmarks over long distances. For vision-based motion estimation, the absolute translation of the vehicle is approximated from the posterior information of the previous step. Structure from motion based on bearing and range sensors can yield a corrected local position over short movements, but its errors accumulate over time. To exploit the advantages of both sensors, an Extended Kalman Filter framework is applied to integrate them for localization estimation. The experiments were carried out using an electric vehicle with the omnidirectional camera mounted on the roof and the laser device mounted on the bumper. The results demonstrate the effectiveness of this method on large field-of-view scene images of an outdoor environment. | en_US |
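Note: The abstract outlines a standard EKF predict/correct cycle in which laser-rangefinder odometry drives the prediction step and omnidirectional-camera bearings drive the correction step. The sketch below is a minimal illustration of that cycle for a planar pose state (x, y, theta) with a bearing observation of a known landmark; the state layout, motion model, and function names are assumptions for illustration only, not the authors' implementation.

import numpy as np

# Minimal EKF predict/correct sketch for planar vehicle localization.
# Assumed state: [x, y, theta]; motion input u = (dx, dy, dtheta) in the
# vehicle frame (e.g. from laser-rangefinder odometry); measurement z is
# the bearing (rad) to a landmark at known position (lx, ly), as an
# omnidirectional camera would provide. Illustrative only.

def predict(x, P, u, Q):
    """Propagate the pose with motion increment u; Q is motion noise covariance."""
    dx, dy, dtheta = u
    c, s = np.cos(x[2]), np.sin(x[2])
    x_pred = np.array([x[0] + c * dx - s * dy,
                       x[1] + s * dx + c * dy,
                       x[2] + dtheta])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1, 0, -s * dx - c * dy],
                  [0, 1,  c * dx - s * dy],
                  [0, 0,  1]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def correct(x, P, z, landmark, R):
    """Update with a bearing measurement z of a known landmark; R is measurement noise variance."""
    lx, ly = landmark
    dx, dy = lx - x[0], ly - x[1]
    q = dx * dx + dy * dy
    z_hat = np.arctan2(dy, dx) - x[2]            # predicted bearing
    H = np.array([[dy / q, -dx / q, -1.0]])      # measurement Jacobian
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    innov = np.arctan2(np.sin(z - z_hat), np.cos(z - z_hat))  # wrapped innovation
    x_new = x + (K * innov).ravel()
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new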
dc.format | application/pdf | |
dc.format | text/html | |
dc.language | eng | |
dc.rights | info:eu-repo/semantics/embargoedAccess | |
dc.subject | Cameras | en_US |
dc.subject | Global Positioning System | en_US |
dc.subject | Vehicles | en_US |
dc.subject | Estimation | en_US |
dc.subject | Mirrors | en_US |
dc.subject | Sensors | en_US |
dc.subject | Trajectory | en_US |
dc.title | Localization estimation based on Extended Kalman filter using multiple sensors | en_US |
dc.type | info:eu-repo/semantics/article | |
dc.type | info:eu-repo/semantics/publishedVersion | |