TY - GEN
T1 - L2V2T2Calib
T2 - 34th IEEE Intelligent Vehicles Symposium, IV 2023
AU - Zhang, Jun
AU - Liu, Yiyao
AU - Wen, Mingxing
AU - Yue, Yufeng
AU - Zhang, Haoyuan
AU - Wang, Danwei
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Extrinsic calibration between LiDAR-camera and LiDAR-LiDAR pairs has been researched extensively because it is the foundation for sensor fusion, and many open-source projects have significantly promoted related research. However, few solutions unify calibration across repetitive-scanning and non-repetitive-scanning 3D LiDARs, sparse and dense 3D LiDARs, and visual and thermal cameras. Currently, achieving this normally requires different targets and different feature-extraction methods for each sensor combination, and sometimes human intervention to locate the target, which is inconvenient and time-consuming. In this paper, L2V2T2Calib is introduced and open-sourced as an attempt to unify the calibration. 1) A board with four circular holes is adopted for all sensors; the four circle centers can be detected by every sensor and are therefore ideal common features. Previous works also use this target, but their algorithms do not consider non-repetitive-scanning LiDARs and thus cannot be applied directly. 2) To unify the process, a key step is to automatically and robustly detect the target from different types of LiDARs, a problem that has received little attention. We propose a method based on template matching that is simple yet effective and generalizes to different depth sensors. 3) We provide two types of output for different users, minimizing the 2D re-projection error (Min2D) and minimizing the 3D matching error (Min3D), and compare their performance. Extensive experiments in both simulated and real environments demonstrate that L2V2T2Calib is accurate, robust, and, more importantly, unified. The code will be open-sourced to promote related research at: https://github.com/Clothooo/lvt2calib
AB - Extrinsic calibration between LiDAR-camera and LiDAR-LiDAR pairs has been researched extensively because it is the foundation for sensor fusion, and many open-source projects have significantly promoted related research. However, few solutions unify calibration across repetitive-scanning and non-repetitive-scanning 3D LiDARs, sparse and dense 3D LiDARs, and visual and thermal cameras. Currently, achieving this normally requires different targets and different feature-extraction methods for each sensor combination, and sometimes human intervention to locate the target, which is inconvenient and time-consuming. In this paper, L2V2T2Calib is introduced and open-sourced as an attempt to unify the calibration. 1) A board with four circular holes is adopted for all sensors; the four circle centers can be detected by every sensor and are therefore ideal common features. Previous works also use this target, but their algorithms do not consider non-repetitive-scanning LiDARs and thus cannot be applied directly. 2) To unify the process, a key step is to automatically and robustly detect the target from different types of LiDARs, a problem that has received little attention. We propose a method based on template matching that is simple yet effective and generalizes to different depth sensors. 3) We provide two types of output for different users, minimizing the 2D re-projection error (Min2D) and minimizing the 3D matching error (Min3D), and compare their performance. Extensive experiments in both simulated and real environments demonstrate that L2V2T2Calib is accurate, robust, and, more importantly, unified. The code will be open-sourced to promote related research at: https://github.com/Clothooo/lvt2calib
UR - http://www.scopus.com/inward/record.url?scp=85167995303&partnerID=8YFLogxK
U2 - 10.1109/IV55152.2023.10186657
DO - 10.1109/IV55152.2023.10186657
M3 - Conference contribution
AN - SCOPUS:85167995303
T3 - IEEE Intelligent Vehicles Symposium, Proceedings
BT - IV 2023 - IEEE Intelligent Vehicles Symposium, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 4 June 2023 through 7 June 2023
ER -