Semi-Direct Visual Odometry and Mapping System with RGB-D Camera

Xinliang Zhong, Xiao Luo*, Jiaheng Zhao, Yutong Huang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, a semi-direct visual odometry and mapping system with an RGB-D camera is proposed, which combines the merits of both feature-based and direct methods. The presented system directly estimates the camera motion between two consecutive RGB-D frames by minimizing the photometric error. To cope with outliers and noise, a robust sensor model built upon the t-distribution and an error function mixing depth and photometric errors are used to enhance accuracy and robustness. Local graph optimization based on keyframes is used to reduce the accumulated error and refine the local map. The loop-closure detection method, which combines an appearance-similarity method with spatial location constraints, increases the speed of detection. Experimental results demonstrate that the proposed approach achieves higher accuracy in motion estimation and environment reconstruction than other state-of-the-art methods. Moreover, the proposed approach works in real time on a laptop without a GPU, which makes it attractive for robots equipped with limited computational resources.
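As a rough illustration only (not taken from the paper itself), a semi-direct RGB-D formulation of the kind described above typically minimizes, over the camera motion xi, a robustly weighted sum of photometric and depth residuals; the symbols below and the balancing factor lambda are assumptions for the sketch:

E(\xi) = \sum_i w_i \Big[ \big( I_2(\pi(T(\xi)\,\mathbf{p}_i)) - I_1(\pi(\mathbf{p}_i)) \big)^2 + \lambda \big( Z_2(\pi(T(\xi)\,\mathbf{p}_i)) - [T(\xi)\,\mathbf{p}_i]_z \big)^2 \Big], \qquad w_i = \frac{\nu + 1}{\nu + r_i^2/\sigma^2},

where I_1, I_2 and Z_2 denote the reference intensity image, the current intensity image, and the current depth image, T(\xi) is the rigid-body motion, \pi is the pinhole projection, and w_i is the per-pixel weight derived from a t-distribution with \nu degrees of freedom applied to the normalized residual r_i, reflecting the robust sensor model mentioned in the abstract.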

Original language: English
Pages (from-to): 83-93
Number of pages: 11
Journal: Journal of Beijing Institute of Technology (English Edition)
Volume: 28
Issue number: 1
DOI
Publication status: Published - 1 Mar 2019
