TY - GEN
T1 - PLPD-SLAM
T2 - 18th IEEE International Conference on Control and Automation, ICCA 2024
AU - Dong, Juan
AU - Lu, Maobin
AU - Xu, Yong
AU - Deng, Fang
AU - Chen, Jie
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - The majority of visual Simultaneous Localization and Mapping (SLAM) algorithms are built upon the assumption of static environmental conditions. However, this assumption limits the applicability of visual SLAM systems in real-world scenarios. Some methods use deep learning-based image segmentation to remove dynamic objects before tracking, which slows the tracking process. Others, relying on object detection, retain too few point features after removing those on dynamic objects, causing significant drift in the tracked trajectory. In this paper, we propose a dynamic SLAM method based on point-line-plane features. We calculate information entropy to measure the complexity of the pixel distribution. If the information is sufficient, only point and line features are used for trajectory tracking; otherwise, plane features are added. We employ YOLOv5 for dynamic object detection, enabling robust tracking in dynamic scenarios by selecting reliable features. By performing object detection and tracking in parallel, we improve the real-time performance of the system. In sharp contrast to existing methods, PLPD-SLAM can handle environments with dynamic objects in real time and significantly reduce the long-term drift they cause. Finally, we evaluate our method on public benchmarks and our own dynamic laboratory scenarios. The experimental results show that our method outperforms other state-of-the-art methods.
AB - The majority of visual Simultaneous Localization and Mapping (SLAM) algorithms are built upon the assumption of static environmental conditions. However, this assumption limits the applicability of visual SLAM systems in real-world scenarios. Some methods use deep learning-based image segmentation to remove dynamic objects before tracking, which slows the tracking process. Others, relying on object detection, retain too few point features after removing those on dynamic objects, causing significant drift in the tracked trajectory. In this paper, we propose a dynamic SLAM method based on point-line-plane features. We calculate information entropy to measure the complexity of the pixel distribution. If the information is sufficient, only point and line features are used for trajectory tracking; otherwise, plane features are added. We employ YOLOv5 for dynamic object detection, enabling robust tracking in dynamic scenarios by selecting reliable features. By performing object detection and tracking in parallel, we improve the real-time performance of the system. In sharp contrast to existing methods, PLPD-SLAM can handle environments with dynamic objects in real time and significantly reduce the long-term drift they cause. Finally, we evaluate our method on public benchmarks and our own dynamic laboratory scenarios. The experimental results show that our method outperforms other state-of-the-art methods.
UR - http://www.scopus.com/inward/record.url?scp=85200367680&partnerID=8YFLogxK
U2 - 10.1109/ICCA62789.2024.10591805
DO - 10.1109/ICCA62789.2024.10591805
M3 - Conference contribution
AN - SCOPUS:85200367680
T3 - IEEE International Conference on Control and Automation, ICCA
SP - 719
EP - 724
BT - 2024 IEEE 18th International Conference on Control and Automation, ICCA 2024
PB - IEEE Computer Society
Y2 - 18 June 2024 through 21 June 2024
ER -