TY - GEN
T1 - Dynamic object tracking for self-driving cars using monocular camera and LIDAR
AU - Zhao, Lin
AU - Wang, Meiling
AU - Su, Sheng
AU - Liu, Tong
AU - Yang, Yi
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/24
Y1 - 2020/10/24
N2 - The detection and tracking of dynamic traffic participants (e.g., pedestrians, cars, and bicyclists) plays an important role in reliable decision-making and intelligent navigation for autonomous vehicles. However, due to the rapid movement of the target, most current vision-based tracking methods, which perform tracking in the image domain or invoke 3D information only in parts of their pipeline, have real-life limitations such as the inability to recover tracking after the target is lost. In this work, we overcome such limitations and propose a complete system for dynamic object tracking in 3D space that combines: (1) a 3D position tracking algorithm for the dynamic object based on a monocular camera and LIDAR; (2) a re-tracking mechanism (RTM) that restores tracking when the target reappears in the camera's field of view. Compared with existing methods, each sensor in our method performs its own role to preserve reliability, and further extends its functions through a novel multimodality fusion module. We perform experiments in a real-world self-driving environment and achieve the desired 10 Hz update rate for real-time performance. Our quantitative and qualitative analysis shows that this system is reliable for the dynamic object tracking needs of self-driving cars.
AB - The detection and tracking of dynamic traffic participants (e.g., pedestrians, cars, and bicyclists) plays an important role in reliable decision-making and intelligent navigation for autonomous vehicles. However, due to the rapid movement of the target, most current vision-based tracking methods, which perform tracking in the image domain or invoke 3D information only in parts of their pipeline, have real-life limitations such as the inability to recover tracking after the target is lost. In this work, we overcome such limitations and propose a complete system for dynamic object tracking in 3D space that combines: (1) a 3D position tracking algorithm for the dynamic object based on a monocular camera and LIDAR; (2) a re-tracking mechanism (RTM) that restores tracking when the target reappears in the camera's field of view. Compared with existing methods, each sensor in our method performs its own role to preserve reliability, and further extends its functions through a novel multimodality fusion module. We perform experiments in a real-world self-driving environment and achieve the desired 10 Hz update rate for real-time performance. Our quantitative and qualitative analysis shows that this system is reliable for the dynamic object tracking needs of self-driving cars.
UR - http://www.scopus.com/inward/record.url?scp=85102396538&partnerID=8YFLogxK
U2 - 10.1109/IROS45743.2020.9341179
DO - 10.1109/IROS45743.2020.9341179
M3 - Conference contribution
AN - SCOPUS:85102396538
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 10865
EP - 10872
BT - 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2020
Y2 - 24 October 2020 through 24 January 2021
ER -