TY - GEN
T1 - LL-SLAM
T2 - 42nd Chinese Control Conference, CCC 2023
AU - Wang, Aobo
AU - Zhong, Rui
AU - Zheng, Kefan
AU - Fang, Hao
N1 - Publisher Copyright:
© 2023 Technical Committee on Control Theory, Chinese Association of Automation.
PY - 2023
Y1 - 2023
N2 - This paper presents LL-SLAM, a lightweight visual-inertial simultaneous localization and mapping (SLAM) system based on a loosely coupled method for autonomous flight and navigation tasks of unmanned aerial vehicles (UAVs). LL-SLAM consists of a stereo visual pose estimation module (visual module) and an EKF-based inertial pose estimation module (inertial module), which have complementary strengths. LL-SLAM fuses the poses of the two modules into a single accurate and robust pose estimate according to the tracking status, using a loosely coupled pose integration algorithm. The system innovatively uses inertial poses to provide prior poses for the visual module and uses visual poses to provide feedback for the inertial module. It also proposes an adaptive feature adjustment algorithm, which effectively balances accuracy against computational cost. The characteristics of LL-SLAM, such as computational efficiency, excellent robustness, absolute trajectory scale, rapid initialization, and high accuracy, make the system well suited to UAV flights. We evaluate our system on public benchmarks and UAV flights for pose estimation accuracy and time cost, comparing it to other state-of-the-art SLAM systems. In addition, experiments on complex UAV flight tasks show that our system favorably meets the needs of UAV autonomous navigation.
AB - This paper presents LL-SLAM, a lightweight visual-inertial simultaneous localization and mapping (SLAM) system based on a loosely coupled method for autonomous flight and navigation tasks of unmanned aerial vehicles (UAVs). LL-SLAM consists of a stereo visual pose estimation module (visual module) and an EKF-based inertial pose estimation module (inertial module), which have complementary strengths. LL-SLAM fuses the poses of the two modules into a single accurate and robust pose estimate according to the tracking status, using a loosely coupled pose integration algorithm. The system innovatively uses inertial poses to provide prior poses for the visual module and uses visual poses to provide feedback for the inertial module. It also proposes an adaptive feature adjustment algorithm, which effectively balances accuracy against computational cost. The characteristics of LL-SLAM, such as computational efficiency, excellent robustness, absolute trajectory scale, rapid initialization, and high accuracy, make the system well suited to UAV flights. We evaluate our system on public benchmarks and UAV flights for pose estimation accuracy and time cost, comparing it to other state-of-the-art SLAM systems. In addition, experiments on complex UAV flight tasks show that our system favorably meets the needs of UAV autonomous navigation.
KW - EKF
KW - Lightweight Pose Estimation
KW - UAV Autonomous Navigation
KW - Visual-Inertial SLAM
UR - http://www.scopus.com/inward/record.url?scp=85175538463&partnerID=8YFLogxK
U2 - 10.23919/CCC58697.2023.10239955
DO - 10.23919/CCC58697.2023.10239955
M3 - Conference contribution
AN - SCOPUS:85175538463
T3 - Chinese Control Conference, CCC
SP - 3632
EP - 3637
BT - 2023 42nd Chinese Control Conference, CCC 2023
PB - IEEE Computer Society
Y2 - 24 July 2023 through 26 July 2023
ER -