TY - GEN
T1 - Tightly-coupled Lidar-GNSS-Inertial Fusion Odometry and Mapping
AU - Yu, Shuwei
AU - Li, Jing
AU - Niu, Tianwei
AU - Wang, Junzheng
N1 - Publisher Copyright:
© 2024 Technical Committee on Control Theory, Chinese Association of Automation.
PY - 2024
Y1 - 2024
N2 - This paper proposes a tightly-coupled lidar-GNSS-inertial fusion system that achieves accurate state estimation and mapping for robot navigation. The system fuses lidar points, GNSS measurements, and IMU data with an iterated error-state Kalman filter to obtain precise, drift-free, real-time odometry. To improve computational efficiency, an incremental k-d tree (ikd-Tree) is employed to manage the local map, and separate Kalman gain formulations are used for GNSS and lidar point observations to reduce the computational load. Furthermore, a factor graph is introduced to further refine the poses. By selectively inserting keyframes based on the odometry estimates, the graph incorporates odometry, GNSS, and loop-closure constraints to optimize all keyframe poses, yielding a precise trajectory and global map. Finally, extensive experiments are conducted on the KITTI dataset and in several real-world scenarios to validate the proposed approach. The experimental results demonstrate that our method achieves precise localization and mapping across diverse environments.
AB - This paper proposes a tightly-coupled lidar-GNSS-inertial fusion system that achieves accurate state estimation and mapping for robot navigation. The system fuses lidar points, GNSS measurements, and IMU data with an iterated error-state Kalman filter to obtain precise, drift-free, real-time odometry. To improve computational efficiency, an incremental k-d tree (ikd-Tree) is employed to manage the local map, and separate Kalman gain formulations are used for GNSS and lidar point observations to reduce the computational load. Furthermore, a factor graph is introduced to further refine the poses. By selectively inserting keyframes based on the odometry estimates, the graph incorporates odometry, GNSS, and loop-closure constraints to optimize all keyframe poses, yielding a precise trajectory and global map. Finally, extensive experiments are conducted on the KITTI dataset and in several real-world scenarios to validate the proposed approach. The experimental results demonstrate that our method achieves precise localization and mapping across diverse environments.
KW - Mapping
KW - Robot navigation
KW - Sensor fusion
KW - State estimation
UR - http://www.scopus.com/inward/record.url?scp=85205447794&partnerID=8YFLogxK
U2 - 10.23919/CCC63176.2024.10662340
DO - 10.23919/CCC63176.2024.10662340
M3 - Conference contribution
AN - SCOPUS:85205447794
T3 - Chinese Control Conference, CCC
SP - 3888
EP - 3893
BT - Proceedings of the 43rd Chinese Control Conference, CCC 2024
A2 - Na, Jing
A2 - Sun, Jian
PB - IEEE Computer Society
T2 - 43rd Chinese Control Conference, CCC 2024
Y2 - 28 July 2024 through 31 July 2024
ER -