Real Time Camera Homography Error Modelling For Sensorial Data Fusion
Abstract
Camera homography estimation relies on complex algorithms. These provide good results after several iterations; however, the error model supplied with such algorithms does not seem appropriate for real-time sensorial data fusion. In this work, a new methodology is proposed for fusing camera homography and IMU data. The experiment consists of rotating the camera and is carried out over a sequence of frames. Results are presented on the homography estimation error with and without bias compensation. The last section discusses the experimental results and the conclusions drawn from this approach.
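To make the comparison of the estimation error with and without bias compensation concrete, the following is a minimal sketch, not the authors' method: it synthesizes feature correspondences under a pure camera rotation, adds zero-mean noise plus an assumed constant pixel bias, and estimates the homography with OpenCV's findHomography before and after subtracting that bias. The intrinsics, rotation angle, noise level, and bias value are all illustrative assumptions, and the compensation step simply removes a bias that is taken to be known from calibration.

```python
# Illustrative sketch only: homography estimation error for a rotating
# camera, with and without compensating a constant measurement bias.
import numpy as np
import cv2

K = np.array([[800.0, 0.0, 320.0],   # assumed pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])

def rotation_homography(yaw_deg):
    """Homography induced by a pure yaw rotation: H = K R K^-1."""
    a = np.deg2rad(yaw_deg)
    R = np.array([[ np.cos(a), 0.0, np.sin(a)],
                  [ 0.0,       1.0, 0.0      ],
                  [-np.sin(a), 0.0, np.cos(a)]])
    return K @ R @ np.linalg.inv(K)

def project(H, pts):
    """Apply a 3x3 homography to Nx2 pixel coordinates."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]

rng = np.random.default_rng(0)
H_true = rotation_homography(2.0)                     # assumed 2 deg yaw step
src = rng.uniform([0, 0], [640, 480], size=(200, 2))  # synthetic features
noise = rng.normal(0.0, 0.5, src.shape)               # zero-mean pixel noise
bias = np.array([0.8, -0.4])                          # assumed systematic offset
dst = project(H_true, src) + noise + bias

# Estimate without any bias handling: the offset leaks into the homography.
H_raw, _ = cv2.findHomography(src.astype(np.float32),
                              dst.astype(np.float32), cv2.RANSAC, 3.0)

# Estimate after removing the (here assumed known) bias from the measurements.
H_comp, _ = cv2.findHomography(src.astype(np.float32),
                               (dst - bias).astype(np.float32),
                               cv2.RANSAC, 3.0)

def frob_error(H_est):
    """Frobenius-norm error between normalized estimated and true homographies."""
    return np.linalg.norm(H_est / H_est[2, 2] - H_true / H_true[2, 2])

print("error without bias compensation:", frob_error(H_raw))
print("error with bias compensation:   ", frob_error(H_comp))
```

In this synthetic setting the uncompensated estimate absorbs the constant offset into its translation-like entries, so its error stays near the bias magnitude, while the compensated estimate is limited only by the random noise; the paper's own compensation scheme and numbers are given in the experimental section.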