TY - GEN
T1 - DOTF-SLAM
T2 - 2023 IEEE International Conference on Unmanned Systems, ICUS 2023
AU - Liu, Yixuan
AU - Zhao, Xuyang
AU - Liu, Zhengmao
AU - Yu, Chengpu
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Traditional visual simultaneous localization and mapping (SLAM) algorithms assume static scenes, which limits their application in real-world environments where dynamics are prevalent, such as autonomous driving and multi-robot collaboration. Therefore, clear information about the dynamic environment is needed to aid decision-making and scene understanding. To address this problem, this paper develops a method based on the ORB-SLAM2 framework that is more robust when operating in dynamic environments. In our method, we combine dynamic object tracking, prediction, and dynamic feature point filtering to eliminate the influence of dynamic objects on localization and map construction. On the TUM dataset, the algorithm reduces the Absolute Trajectory Error (ATE) by more than 80% compared to ORB-SLAM2, while the improvement on dynamic segments of the KITTI dataset is around 20%. In addition, we achieve real-time performance of over 15 FPS with localization accuracy comparable to DynaSLAM and DS-SLAM, which can only achieve approximately 2-3 FPS. According to the experimental results, the suggested algorithm can successfully improve localization accuracy in highly dynamic situations.
AB - Traditional visual simultaneous localization and mapping (SLAM) algorithms assume static scenes, which limits their application in real-world environments where dynamics are prevalent, such as autonomous driving and multi-robot collaboration. Therefore, clear information about the dynamic environment is needed to aid decision-making and scene understanding. To address this problem, this paper develops a method based on the ORB-SLAM2 framework that is more robust when operating in dynamic environments. In our method, we combine dynamic object tracking, prediction, and dynamic feature point filtering to eliminate the influence of dynamic objects on localization and map construction. On the TUM dataset, the algorithm reduces the Absolute Trajectory Error (ATE) by more than 80% compared to ORB-SLAM2, while the improvement on dynamic segments of the KITTI dataset is around 20%. In addition, we achieve real-time performance of over 15 FPS with localization accuracy comparable to DynaSLAM and DS-SLAM, which can only achieve approximately 2-3 FPS. According to the experimental results, the suggested algorithm can successfully improve localization accuracy in highly dynamic situations.
KW - dynamic scenario
KW - key-point filtering
KW - object detecting and tracking
KW - visual SLAM
UR - http://www.scopus.com/inward/record.url?scp=85180126683&partnerID=8YFLogxK
U2 - 10.1109/ICUS58632.2023.10318260
DO - 10.1109/ICUS58632.2023.10318260
M3 - Conference contribution
AN - SCOPUS:85180126683
T3 - Proceedings of 2023 IEEE International Conference on Unmanned Systems, ICUS 2023
SP - 257
EP - 262
BT - Proceedings of 2023 IEEE International Conference on Unmanned Systems, ICUS 2023
A2 - Song, Rong
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 13 October 2023 through 15 October 2023
ER -