TY - GEN
T1 - A Lightweight Centralized Collaborative Visual-Inertial SLAM
AU - Ding, Meng
AU - Wei, Chao
AU - Qian, Xinhao
AU - Feng, Fuyong
AU - Zhang, Ruijie
AU - Li, Lantao
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - An individual unmanned platform driving autonomously in complex environments usually encounters problems such as a limited perception range, low mapping efficiency, and poor robustness, making it difficult to complete SLAM tasks consistently and accurately. Collaborative SLAM, in which multiple unmanned platforms localize and map together, can realize efficient, robust, and accurate SLAM in large environments. This paper proposes a lightweight centralized collaborative visual-inertial SLAM strategy. Each sub-unmanned platform independently runs visual-inertial odometry and shares keyframes and map information with the central platform. The central platform uses the data contributed by the sub-unmanned platforms to establish accurate collaborative pose estimates and a consistent global environment map, refining the collaborative estimation through place recognition, data association, and global optimization. In addition, we reduce the scale of the SLAM problem by removing redundant data, achieving accurate and efficient visual-inertial SLAM. Extensive evaluations on multiple public open-source datasets demonstrate that the proposed algorithm exhibits excellent localization accuracy, robustness, and scalability.
AB - An individual unmanned platform driving autonomously in complex environments usually encounters problems such as a limited perception range, low mapping efficiency, and poor robustness, making it difficult to complete SLAM tasks consistently and accurately. Collaborative SLAM, in which multiple unmanned platforms localize and map together, can realize efficient, robust, and accurate SLAM in large environments. This paper proposes a lightweight centralized collaborative visual-inertial SLAM strategy. Each sub-unmanned platform independently runs visual-inertial odometry and shares keyframes and map information with the central platform. The central platform uses the data contributed by the sub-unmanned platforms to establish accurate collaborative pose estimates and a consistent global environment map, refining the collaborative estimation through place recognition, data association, and global optimization. In addition, we reduce the scale of the SLAM problem by removing redundant data, achieving accurate and efficient visual-inertial SLAM. Extensive evaluations on multiple public open-source datasets demonstrate that the proposed algorithm exhibits excellent localization accuracy, robustness, and scalability.
KW - collaborative SLAM
KW - multi-unmanned platform
KW - simultaneous localization and mapping
KW - visual-inertial odometry
UR - http://www.scopus.com/inward/record.url?scp=85218025091&partnerID=8YFLogxK
U2 - 10.1109/ICUS61736.2024.10839801
DO - 10.1109/ICUS61736.2024.10839801
M3 - Conference contribution
AN - SCOPUS:85218025091
T3 - Proceedings of 2024 IEEE International Conference on Unmanned Systems, ICUS 2024
SP - 775
EP - 780
BT - Proceedings of 2024 IEEE International Conference on Unmanned Systems, ICUS 2024
A2 - Song, Rong
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 IEEE International Conference on Unmanned Systems, ICUS 2024
Y2 - 18 October 2024 through 20 October 2024
ER -