TY - JOUR
T1 - Multi-Sensor Fusion and Cooperative Perception for Autonomous Driving
T2 - A Review
AU - Xiang, Chao
AU - Feng, Chen
AU - Xie, Xiaopo
AU - Shi, Botian
AU - Lu, Hao
AU - Lv, Yisheng
AU - Yang, Mingchuan
AU - Niu, Zhendong
N1 - Publisher Copyright:
© 2009-2012 IEEE.
PY - 2023/9/1
Y1 - 2023/9/1
N2 - Autonomous driving (AD), encompassing both single-vehicle intelligent AD and vehicle-infrastructure cooperative AD, has become a research hotspot in academia and industry, and multi-sensor fusion is a fundamental task for AD system perception. However, multi-sensor fusion must contend with differences in the type and dimensionality of the sensory data acquired by different sensors (cameras, lidar, millimeter-wave radar, etc.), as well as differences in environmental perception performance arising from different fusion strategies. In this article, we survey a broad range of work on multi-sensor fusion in AD and address the problem that the category divisions in current multi-sensor fusion perception are insufficiently detailed, unclear, and largely subjective, so that similar algorithms are often classified very differently. We propose a novel multi-sensor fusion taxonomy that divides fusion perception classification strategies into two categories, symmetric fusion and asymmetric fusion, and seven subcategories of strategy combinations over data, features, and results. In addition, the reliability of current AD perception is limited by insufficient environment perception capability and by the limited robustness of data-driven methods in extreme situations (e.g., blind areas). This article also summarizes innovative applications of multi-sensor fusion classification strategies in AD cooperative perception.
AB - Autonomous driving (AD), encompassing both single-vehicle intelligent AD and vehicle-infrastructure cooperative AD, has become a research hotspot in academia and industry, and multi-sensor fusion is a fundamental task for AD system perception. However, multi-sensor fusion must contend with differences in the type and dimensionality of the sensory data acquired by different sensors (cameras, lidar, millimeter-wave radar, etc.), as well as differences in environmental perception performance arising from different fusion strategies. In this article, we survey a broad range of work on multi-sensor fusion in AD and address the problem that the category divisions in current multi-sensor fusion perception are insufficiently detailed, unclear, and largely subjective, so that similar algorithms are often classified very differently. We propose a novel multi-sensor fusion taxonomy that divides fusion perception classification strategies into two categories, symmetric fusion and asymmetric fusion, and seven subcategories of strategy combinations over data, features, and results. In addition, the reliability of current AD perception is limited by insufficient environment perception capability and by the limited robustness of data-driven methods in extreme situations (e.g., blind areas). This article also summarizes innovative applications of multi-sensor fusion classification strategies in AD cooperative perception.
UR - http://www.scopus.com/inward/record.url?scp=85166778820&partnerID=8YFLogxK
U2 - 10.1109/MITS.2023.3283864
DO - 10.1109/MITS.2023.3283864
M3 - Review article
AN - SCOPUS:85166778820
SN - 1939-1390
VL - 15
SP - 36
EP - 58
JO - IEEE Intelligent Transportation Systems Magazine
JF - IEEE Intelligent Transportation Systems Magazine
IS - 5
ER -