TY - JOUR
T1 - Targetless Extrinsic Calibration of Camera and Low-Resolution 3-D LiDAR
AU - Ou, Ni
AU - Cai, Hanyu
AU - Yang, Jiawen
AU - Wang, Junzheng
N1 - Publisher Copyright:
© 2001-2012 IEEE.
PY - 2023/5/15
Y1 - 2023/5/15
N2 - Autonomous driving heavily relies on light detection and ranging (LiDAR) and camera sensors, which can significantly improve the performance of perception and navigation tasks when fused together. The success of this cross-modality fusion hinges on accurate extrinsic calibration. In recent years, targetless LiDAR-camera calibration methods have gained increasing attention, thanks to their independence from external targets. Nevertheless, developing a targetless method for low-resolution LiDARs remains challenging due to the difficulty of extracting reliable features from point clouds with limited LiDAR beams. In this article, we propose a robust targetless method to solve this challenging problem. It automatically estimates accurate LiDAR and camera poses and solves for the extrinsic matrix through hand-eye calibration. Moreover, we carefully analyze pose estimation issues arising with low-resolution LiDARs and present our solution. Real-world experiments are carried out on an unmanned ground vehicle (UGV)-mounted multisensor platform containing a charge-coupled device (CCD) camera and a VLP-16 LiDAR. For evaluation, we use a state-of-the-art target-based calibration approach to generate the ground-truth extrinsic parameters. Experimental results demonstrate that our method achieves low calibration error in both translation (3 cm) and rotation (0.59°).
KW - Camera
KW - light detection and ranging (LiDAR)
KW - pose graph optimization
KW - sensor calibration
KW - simultaneous localization and mapping (SLAM)
UR - http://www.scopus.com/inward/record.url?scp=85153341110&partnerID=8YFLogxK
U2 - 10.1109/JSEN.2023.3263833
DO - 10.1109/JSEN.2023.3263833
M3 - Article
AN - SCOPUS:85153341110
SN - 1530-437X
VL - 23
SP - 10889
EP - 10899
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
IS - 10
ER -