Semi-Direct Visual Odometry and Mapping System with RGB-D Camera

Xinliang Zhong, Xiao Luo*, Jiaheng Zhao, Yutong Huang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, a semi-direct visual odometry and mapping system with an RGB-D camera is proposed, which combines the merits of both feature-based and direct methods. The presented system directly estimates the camera motion between two consecutive RGB-D frames by minimizing the photometric error. To tolerate outliers and noise, a robust sensor model built upon the t-distribution and an error function mixing depth and photometric errors are used to enhance accuracy and robustness. Local graph optimization based on key frames is used to reduce the accumulated error and refine the local map. The loop closure detection method, which combines an appearance-similarity method with spatial location constraints, increases the speed of detection. Experimental results demonstrate that the proposed approach achieves higher accuracy in motion estimation and environment reconstruction than other state-of-the-art methods. Moreover, the proposed approach runs in real time on a laptop without a GPU, which makes it attractive for robots with limited computational resources.
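The abstract's t-distribution sensor model refers to a standard robust-weighting scheme in direct RGB-D odometry: each photometric residual is down-weighted according to a Student-t error model whose scale is re-estimated iteratively. The sketch below illustrates that weighting step only (not the full odometry pipeline from the paper); the degrees-of-freedom value of 5 and the fixed iteration count are illustrative assumptions, not values taken from the article.

```python
import numpy as np

def t_dist_weights(residuals, dof=5.0, iters=10):
    """Robust weights under a Student-t error model.

    Iteratively re-estimates the scale sigma^2 of the residual
    distribution, then returns per-residual weights
        w_i = (dof + 1) / (dof + r_i^2 / sigma^2),
    so large (outlier) residuals receive small weights.
    dof=5 is an illustrative choice, not a value from the paper.
    """
    r = np.asarray(residuals, dtype=float)
    sigma2 = np.mean(r ** 2)  # initial scale estimate
    for _ in range(iters):
        w = (dof + 1.0) / (dof + r ** 2 / sigma2)
        sigma2 = np.mean(w * r ** 2)  # re-estimate scale with current weights
    return (dof + 1.0) / (dof + r ** 2 / sigma2)

# Four small inlier residuals plus one gross outlier:
res = np.array([0.1, -0.2, 0.15, -0.05, 5.0])
w = t_dist_weights(res)
# The outlier's weight ends up a small fraction of an inlier's weight,
# so it contributes little to the weighted least-squares motion update.
```

In a direct method, these weights would multiply the squared photometric residuals inside each Gauss-Newton iteration of the frame-to-frame alignment, which is how the system tolerates occlusions and intensity outliers without explicit outlier rejection.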

Original language: English
Pages (from-to): 83-93
Number of pages: 11
Journal: Journal of Beijing Institute of Technology (English Edition)
Volume: 28
Issue number: 1
DOIs
Publication status: Published - 1 Mar 2019

Keywords

  • 3D mapping
  • Localization
  • Loop closure detection
  • RGB-D simultaneous localization and mapping (SLAM)
  • Visual odometry
