Abstract
Accurate positioning is crucial for intelligent vehicles, especially in spatially constrained scenarios such as close vehicle proximity, tight parking spaces, and the docking process of autonomous modular buses (AMBs). Binocular cameras and Light Detection and Ranging (LiDAR) have shown great potential in intelligent vehicle applications. However, existing methods mostly rely on comparing the inter-camera extrinsic matrices with the results of single-camera-to-LiDAR calibration. This not only accumulates errors across the individual steps but also makes it difficult to pinpoint the source of error when calibration results are suboptimal. To overcome these problems, this paper proposes a high-precision, phased joint calibration method for binocular cameras and LiDAR, together with a combined global and local evaluation approach, and introduces a visualization scheme that enhances the reliability and intuitiveness of the joint calibration process. Experimental results on AMBs demonstrate that our method and its selection of intrinsic and extrinsic parameters significantly outperform other mainstream methods.
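To make the idea of a phased (two-stage) binocular-camera/LiDAR calibration concrete, the sketch below shows one conventional way such a pipeline can be structured: first stereo calibration (intrinsics plus inter-camera extrinsics), then LiDAR-to-camera extrinsic estimation, followed by a global reprojection check. This is a minimal illustration and not the authors' implementation; it assumes OpenCV (`cv2`), pre-extracted chessboard correspondences for the stereo phase, and pre-matched LiDAR-point-to-pixel pairs for the extrinsic phase. All helper names are hypothetical.

```python
# Conceptual sketch of a phased stereo-camera / LiDAR calibration pipeline.
# NOT the paper's method; it only illustrates calibrating in stages and
# evaluating with a global reprojection metric.
import numpy as np
import cv2


def phase1_stereo_calibration(obj_pts, img_pts_left, img_pts_right, image_size):
    """Phase 1: per-camera intrinsics, then the inter-camera extrinsics (R, T)."""
    # Calibrate each camera individually to obtain intrinsics K and distortion d.
    _, K_l, d_l, _, _ = cv2.calibrateCamera(obj_pts, img_pts_left, image_size, None, None)
    _, K_r, d_r, _, _ = cv2.calibrateCamera(obj_pts, img_pts_right, image_size, None, None)
    # Estimate the relative pose between the cameras with intrinsics held fixed.
    rms, K_l, d_l, K_r, d_r, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, img_pts_left, img_pts_right,
        K_l, d_l, K_r, d_r, image_size, flags=cv2.CALIB_FIX_INTRINSIC)
    return (K_l, d_l), (K_r, d_r), R, T, rms


def phase2_lidar_camera_extrinsics(lidar_pts_3d, img_pts_2d, K, d):
    """Phase 2: LiDAR-to-camera extrinsics from matched 3D-2D points via PnP.

    lidar_pts_3d: (N, 3) float array in the LiDAR frame.
    img_pts_2d:   (N, 2) float array of corresponding pixel coordinates.
    """
    ok, rvec, tvec = cv2.solvePnP(lidar_pts_3d, img_pts_2d, K, d)
    if not ok:
        raise RuntimeError("PnP failed; check the 3D-2D correspondences")
    return rvec, tvec


def global_reprojection_error(lidar_pts_3d, img_pts_2d, rvec, tvec, K, d):
    """Global evaluation: mean reprojection error (pixels) over all correspondences."""
    proj, _ = cv2.projectPoints(lidar_pts_3d, rvec, tvec, K, d)
    return float(np.mean(np.linalg.norm(proj.reshape(-1, 2) - img_pts_2d, axis=1)))
```

A local evaluation of the kind the abstract mentions would instead restrict the same error metric to a region of interest (e.g., the calibration target or the docking area) rather than averaging over all correspondences.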
| Original language | English |
| --- | --- |
| Pages (from-to) | 7404-7415 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Vehicular Technology |
| Volume | 74 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 2025 |
| Externally published | Yes |
Keywords
- Binocular cameras
- LiDAR
- autonomous modular bus (AMB)
- intelligent vehicle
- joint calibration