TY - GEN
T1 - STL-SLAM
T2 - 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2024
AU - Dong, Juan
AU - Lu, Maobin
AU - Chen, Chen
AU - Deng, Fang
AU - Chen, Jie
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Most RGB-D-based SLAM methods assume texture-rich environments, making them susceptible to significant tracking errors or complete failures in the absence of texture features. Moreover, many existing methods suffer from substantial rotation estimation errors, leading to long-term tracking drift. This paper proposes a novel structure-constrained RGB-D SLAM method (STL-SLAM) for texture-limited environments. Compared with existing methods, STL-SLAM can handle environments lacking abundant texture information and significantly reduces the long-term drift caused by rotation estimation errors. We assess the distribution complexity of pixels in an image by calculating its information entropy and pre-process the image accordingly. We also present an efficient Manhattan Frame (MF) detection strategy based on orthogonal planes and lines. If an MF is detected, we decouple rotation and translation, estimate drift-free rotation based on the Manhattan World (MW) coordinate system, and then estimate translation by minimizing the re-projection error of point, line, and plane features. In non-Manhattan frames, the 6-DoF pose is estimated holistically, incorporating structural constraints of parallel and perpendicular planes, as well as parallel and perpendicular lines, into the optimization process. Finally, we evaluate our method on public datasets and in real-world environments, showing that the proposed method achieves superior performance compared to its counterparts.
AB - Most RGB-D-based SLAM methods assume texture-rich environments, making them susceptible to significant tracking errors or complete failures in the absence of texture features. Moreover, many existing methods suffer from substantial rotation estimation errors, leading to long-term tracking drift. This paper proposes a novel structure-constrained RGB-D SLAM method (STL-SLAM) for texture-limited environments. Compared with existing methods, STL-SLAM can handle environments lacking abundant texture information and significantly reduces the long-term drift caused by rotation estimation errors. We assess the distribution complexity of pixels in an image by calculating its information entropy and pre-process the image accordingly. We also present an efficient Manhattan Frame (MF) detection strategy based on orthogonal planes and lines. If an MF is detected, we decouple rotation and translation, estimate drift-free rotation based on the Manhattan World (MW) coordinate system, and then estimate translation by minimizing the re-projection error of point, line, and plane features. In non-Manhattan frames, the 6-DoF pose is estimated holistically, incorporating structural constraints of parallel and perpendicular planes, as well as parallel and perpendicular lines, into the optimization process. Finally, we evaluate our method on public datasets and in real-world environments, showing that the proposed method achieves superior performance compared to its counterparts.
UR - http://www.scopus.com/inward/record.url?scp=85216474815&partnerID=8YFLogxK
U2 - 10.1109/IROS58592.2024.10801465
DO - 10.1109/IROS58592.2024.10801465
M3 - Conference contribution
AN - SCOPUS:85216474815
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 10850
EP - 10855
BT - 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2024
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 14 October 2024 through 18 October 2024
ER -