TY - GEN
T1 - A real-time high-precision visual-inertial SLAM algorithm in dynamic environments
AU - Xing, Jingyao
AU - Wang, Bo
AU - Yin, Zhaojie
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Visual simultaneous localization and mapping (SLAM) technology has achieved promising results in static scenes, yet it still encounters significant challenges in dynamic scenes. A prevalent approach in dynamic SLAM is to detect potential dynamic objects in the environment and eliminate dynamic features using deep learning. Unfortunately, deep learning algorithms that rely on pixel-level object segmentation pose challenges for real-time operation. While methods combining deep-learning-based object detection with multi-view geometry can achieve real-time performance, the multi-view geometry components often presuppose precise camera poses. Given the difficulty existing methods face in balancing accuracy and real-time performance in dynamic environments, this paper introduces a high-precision, real-time visual-inertial SLAM algorithm. The proposed algorithm employs multiple threads running in parallel, including object detection and dynamic object classification, feature tracking, and local state optimization. Within the object detection and dynamic object classification thread, detected objects are categorized based on their relationship with prior dynamic objects, and dynamic objects within this class are further segmented using depth information. We also propose an optical flow tracking method that robustly tracks dynamic objects, addressing cases of missed detections. Experiments on the TUM RGB-D public dataset demonstrate that, with GPU acceleration, the proposed algorithm achieves a balance between accuracy and real-time performance in dynamic environments.
KW - Dynamic Environment
KW - Dynamic Object Tracking
KW - Object Detection Based on Deep Learning
KW - Visual-Inertial SLAM
UR - http://www.scopus.com/inward/record.url?scp=105004728039&partnerID=8YFLogxK
U2 - 10.1109/EECT64505.2025.10966983
DO - 10.1109/EECT64505.2025.10966983
M3 - Conference contribution
AN - SCOPUS:105004728039
T3 - EECT 2025 - 2025 5th International Conference on Advances in Electrical, Electronics and Computing Technology, Proceedings
BT - EECT 2025 - 2025 5th International Conference on Advances in Electrical, Electronics and Computing Technology, Proceedings
A2 - Zhu, Jizhong
A2 - Tseng, King Jet
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 5th International Conference on Advances in Electrical, Electronics and Computing Technology, EECT 2025
Y2 - 21 March 2025 through 23 March 2025
ER -