Large scale egomotion and error analysis with visual features

Miguel Ángel Cazorla Quevedo, Diego Viejo Hernando, Andrés Hernández Gutiérrez, Juan Nieto, Eduardo Nebot

Abstract

Several works deal with 3D data in the SLAM problem, but many of them focus on small-scale maps. In this paper, we propose a method for computing the 6DoF trajectory performed by a robot from the stereo images captured during a large-scale trajectory. The method transforms robust 2D features extracted from the reference stereo images into 3D space. These 3D features are then used to obtain the correct robot movement. Both the SIFT and SURF methods for feature extraction have been used, and a comparison between our method and the results of the ICP algorithm has been performed. We have also carried out a study of the errors in stereo cameras.
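The core steps described above can be sketched in a minimal form: matched 2D features are backprojected to 3D using the stereo disparity and a pinhole camera model, and the rigid motion between two 3D point sets is then recovered in closed form (here via the Kabsch/Horn SVD method, a standard choice; the function names and calibration parameters below are illustrative, not taken from the paper).

```python
import numpy as np

def backproject(uv, disparity, f, cx, cy, baseline):
    """Backproject pixel coordinates to 3D with a rectified stereo pair.

    uv        : (N, 2) pixel coordinates (u, v) in the left image
    disparity : (N,) disparity values in pixels
    f         : focal length in pixels; (cx, cy): principal point
    baseline  : stereo baseline in metres
    Pinhole model: Z = f*b/d, X = (u - cx)*Z/f, Y = (v - cy)*Z/f.
    """
    z = f * baseline / disparity
    x = (uv[:, 0] - cx) * z / f
    y = (uv[:, 1] - cy) * z / f
    return np.column_stack([x, y, z])

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with Q ~ R @ P + t,
    computed by the Kabsch/Horn SVD method on centred point sets."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)              # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

In a trajectory pipeline, `rigid_transform` would be applied to the 3D features matched between consecutive stereo frames, typically inside a robust loop such as RANSAC to reject outlier matches before the final least-squares estimate.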

Keywords

Computer vision; Mobile robotics



DOI: https://doi.org/10.14198/JoPha.2010.4.1.04