Accurate and robust monocular SLAM with omnidirectional cameras

Shuoyuan Liu, Peng Guo*, Lihui Feng, Aiying Yang

*Corresponding author for this work

Research output: Contribution to journal › Article › Peer-review

24 Citations (Scopus)

Abstract

Simultaneous localization and mapping (SLAM) is a fundamental element of many emerging technologies, such as autonomous driving and augmented reality. In this paper, to capture more visual information, we develop an improved monocular visual SLAM system using omnidirectional cameras. Our method extends the ORB-SLAM framework with the enhanced unified camera model as the projection function, which can be applied to catadioptric systems and to wide-angle fisheye cameras with a 195-degree field of view. The proposed system can use the full image area even under strong distortion. A map initialization method is proposed for omnidirectional cameras. We analytically derive the Jacobian matrices of the reprojection errors with respect to the camera pose and the 3D positions of map points. The proposed SLAM system has been extensively tested on real-world datasets. The results show that the positioning error is less than 0.1% in a small indoor environment and less than 1.5% in a large environment. They demonstrate that our method runs in real time and improves accuracy and robustness over conventional systems based on the pinhole model. The source code is available at https://github.com/lsyads/fisheye-ORB-SLAM.
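The enhanced unified camera model (EUCM) mentioned in the abstract maps a 3D point onto the image plane through two intrinsic distortion parameters, commonly denoted α and β, in addition to the usual focal lengths and principal point. The following is a minimal sketch of that projection, assuming the standard EUCM formulation; the parameter values used here are illustrative and are not the calibration from the paper.

```python
import math

def eucm_project(p, fx, fy, cx, cy, alpha, beta):
    """Project a 3D point p = (x, y, z) to pixel coordinates (u, v)
    under the enhanced unified camera model (EUCM)."""
    x, y, z = p
    # Generalized distance term controlled by beta.
    d = math.sqrt(beta * (x * x + y * y) + z * z)
    # Interpolation between d and z controlled by alpha.
    denom = alpha * d + (1.0 - alpha) * z
    u = fx * x / denom + cx
    v = fy * y / denom + cy
    return u, v

# Sanity check: a point on the optical axis (x = y = 0) projects to
# the principal point regardless of alpha and beta.
u, v = eucm_project((0.0, 0.0, 1.0), 400.0, 400.0, 320.0, 240.0, 0.6, 1.1)
print(u, v)  # 320.0 240.0
```

With α = 0 and β arbitrary the denominator reduces to z and the model degenerates to the pinhole projection, which is why a single formulation can cover both conventional and strongly distorted omnidirectional lenses.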

Original language: English
Article number: 4494
Journal: Sensors
Volume: 19
Issue number: 20
DOI
Publication status: Published - 2 Oct 2019
