Show simple item record

dc.contributor.author: Cáceres Hernández, Danilo
dc.contributor.author: Dung Hoang, Van
dc.contributor.author: Hyun Jo, Kang
dc.contributor.author: Ha Le, My
dc.date.accessioned: 2018-06-28T20:59:23Z
dc.date.available: 2018-06-28T20:59:23Z
dc.date.issued: 11/10/2013
dc.identifier: https://ieeexplore.ieee.org/abstract/document/6700032/
dc.identifier.issn: 1553-572X
dc.identifier.uri: http://ridda2.utp.ac.pa/handle/123456789/5086
dc.description.abstract: This paper describes a method for localization estimation based on the Extended Kalman filter, using an omnidirectional camera and a laser rangefinder. Laser rangefinder information is used to predict the absolute motion of the vehicle. The geometric constraint between sequential pairs of omnidirectional images is used to correct the error and construct the map. The advantage of an omnidirectional camera is its large field of view, which helps track feature landmarks over long distances. For vision-based motion estimation, the absolute translation of the vehicle is approximated from the posterior information of the previous step. Structure from motion based on bearing and range sensors can yield a corrected local position over short movements, but its errors accumulate over time. To exploit the advantages of both sensors, an Extended Kalman Filter framework is applied to integrate the multiple sensors for localization estimation. The experiments were carried out using an electric vehicle with the omnidirectional camera mounted on the roof and the laser device mounted on the bumper. The simulation results demonstrate the effectiveness of this method on large field-of-view scene images of outdoor environments. (en_US)
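The fusion scheme the abstract describes — a laser-based motion estimate driving the EKF prediction, and bearings to landmarks tracked in the omnidirectional images driving the correction — can be sketched roughly as below. This is an illustrative toy, not the paper's implementation: the unicycle motion model, the known-landmark bearing measurement, and all noise covariances are assumptions.

```python
import numpy as np

def ekf_predict(x, P, u, Q):
    """Predict step: propagate pose x = (x, y, theta) with odometry u = (v, w, dt).

    Stands in for the laser-rangefinder motion prediction; the unicycle
    model here is an assumption, not the paper's motion model.
    """
    v, w, dt = u
    theta = x[2]
    x_pred = x + np.array([v * dt * np.cos(theta),
                           v * dt * np.sin(theta),
                           w * dt])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                  [0.0, 1.0,  v * dt * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_update_bearing(x, P, z, landmark, R):
    """Update step: correct the pose with a bearing z to a known landmark.

    Stands in for the vision-based correction from tracked feature landmarks.
    """
    dx, dy = landmark[0] - x[0], landmark[1] - x[1]
    q = dx**2 + dy**2
    z_hat = np.arctan2(dy, dx) - x[2]           # predicted bearing
    H = np.array([[dy / q, -dx / q, -1.0]])     # measurement Jacobian
    # Innovation, wrapped to (-pi, pi] to avoid angle discontinuities.
    y = np.array([np.arctan2(np.sin(z - z_hat), np.cos(z - z_hat))])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x_new = x + (K @ y).ravel()
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

In a loop over time steps, each predict call grows the pose covariance according to the motion noise, and each bearing update shrinks it again — which is how the filter keeps the accumulating odometry error in check, as the abstract argues.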
dc.format: application/pdf
dc.format: text/html
dc.language: eng
dc.rights: info:eu-repo/semantics/embargoedAccess
dc.subject: Cameras (en_US)
dc.subject: Global Positioning System (en_US)
dc.subject: Vehicles (en_US)
dc.subject: Estimation (en_US)
dc.subject: Mirrors (en_US)
dc.subject: Sensors (en_US)
dc.subject: Trajectory (en_US)
dc.title: Localization estimation based on Extended Kalman filter using multiple sensors (en_US)
dc.type: info:eu-repo/semantics/article
dc.type: info:eu-repo/semantics/publishedVersion


Files in this item


There are no files associated with this item.

