LiDAR, IMU, and camera fusion for simultaneous localization and mapping: a systematic review

Zheng Fan, Lele Zhang, Xueyi Wang, Yilan Shen, Fang Deng*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Simultaneous Localization and Mapping (SLAM) is a crucial technology that enables intelligent unmanned systems to estimate their motion and reconstruct unknown environments. However, SLAM systems relying on a single sensor suffer from poor robustness and stability due to the inherent limitations of that sensor. Recent studies have demonstrated that SLAM systems fusing multiple sensors, mainly LiDAR, camera, and IMU, achieve better performance because the sensors compensate for one another's weaknesses. This paper investigates recent progress in multi-sensor fusion SLAM. The review systematically analyzes the advantages and disadvantages of different sensors and the need for multi-sensor solutions. It categorizes multi-sensor fusion SLAM systems into four main types by the sensors fused: LiDAR-IMU SLAM, Visual-IMU SLAM, LiDAR-Visual SLAM, and LiDAR-IMU-Visual SLAM, with detailed analysis and discussion of their pipelines and principles. The paper also surveys commonly used datasets and introduces evaluation metrics. Finally, it concludes with a summary of existing challenges and future opportunities for multi-sensor fusion SLAM.
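The abstract mentions evaluation metrics without naming them; a widely used one in SLAM benchmarking is the Absolute Trajectory Error (ATE), reported as the RMSE of position differences between an estimated trajectory and ground truth after alignment. The sketch below is purely illustrative (the function name and sample trajectories are invented for this example, not taken from the paper) and assumes the two trajectories are already time-associated and aligned:

```python
import math

def ate_rmse(gt, est):
    """RMSE of the Absolute Trajectory Error between two aligned,
    time-associated lists of 3-D positions (x, y, z) tuples."""
    assert len(gt) == len(est), "trajectories must have equal length"
    # Squared Euclidean position error at each timestamp
    sq_errs = [sum((g - e) ** 2 for g, e in zip(p, q))
               for p, q in zip(gt, est)]
    return math.sqrt(sum(sq_errs) / len(sq_errs))

# Hypothetical ground-truth vs. estimated positions (metres)
gt  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est = [(0.0, 0.1, 0.0), (1.0, -0.1, 0.0), (2.1, 0.0, 0.0)]
print(round(ate_rmse(gt, est), 4))  # → 0.1
```

In practice the alignment step (e.g., a rigid-body or similarity transform via Umeyama's method) is performed first, since SLAM estimates are expressed in an arbitrary reference frame.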

Original language: English
Article number: 174
Journal: Artificial Intelligence Review
Volume: 58
Issue number: 6
DOIs
Publication status: Published - Jun 2025

Keywords

  • Camera
  • IMU
  • LiDAR
  • Multi-sensor
  • SLAM
