大尺度弱纹理场景下多源信息融合SLAM算法

Translated title of the contribution: Multi-source Information Fusion SLAM Algorithm in Large-scale Weak Texture Scenes

Ye Qing Zhu, Rui Jin, Liang Yu Zhao*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

To obtain locally accurate and globally drift-free state estimation for an autonomous robot in large-scale weak-texture scenes, a SLAM system fusing visual-inertial odometry with the global navigation satellite system (GNSS) is proposed. Firstly, line features are added to the local state estimation to capture the geometric structure of the environment, which effectively improves the accuracy of relative pose estimation between keyframes in weak-texture scenes. Secondly, a linear error representation is introduced in which a line feature is expressed as linear constraints on its endpoints, so that line features can be integrated into a point-feature-based pipeline; this effectively improves the robustness of the algorithm in scenes with repetitive line features. Finally, a multi-source information fusion algorithm fuses the visual-inertial and GNSS measurements to achieve locally accurate and globally drift-free pose estimation, effectively solving the problem of accurate state estimation in large-scale weak-texture scenes. Evaluation on several common datasets shows that the proposed algorithm achieves stronger robustness and higher positioning accuracy.
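The final fusion step described in the abstract requires relating the drift-prone local visual-inertial frame to the global GNSS frame. One common building block for such fusion is a least-squares alignment of the local trajectory to GNSS fixes. The sketch below shows a minimal 2D rigid (Procrustes/Kabsch) alignment; the function name, the 2D simplification, and the synthetic data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def align_local_to_global(local_xy, gnss_xy):
    """Estimate a 2D rigid transform (rotation R, translation t) that maps
    locally estimated positions onto GNSS fixes in the least-squares sense,
    i.e. gnss ~ R @ local + t, via the Kabsch/Procrustes method."""
    mu_l = local_xy.mean(axis=0)          # centroid of local positions
    mu_g = gnss_xy.mean(axis=0)           # centroid of GNSS positions
    # Cross-covariance of the centered point sets
    H = (local_xy - mu_l).T @ (gnss_xy - mu_g)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_g - R @ mu_l
    return R, t
```

In a full system this transform would be re-estimated (or jointly optimized) as new keyframes and GNSS measurements arrive, so that the locally smooth visual-inertial trajectory inherits the global, drift-free reference of GNSS.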

Original language: Chinese (Traditional)
Pages (from-to): 1271-1282
Number of pages: 12
Journal: Yuhang Xuebao/Journal of Astronautics
Volume: 42
Issue number: 10
Publication status: Published - 30 Oct 2021
