TY - GEN
T1 - Fault-Tolerant Multi-Sensor Fusion Positioning System for Autonomous Vehicles in Unknown Outdoor Environments
AU - Zhou, Zijie
AU - Zheng, Ying
AU - Ma, Junyi
AU - Xiong, Guangming
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
AB - The safety of autonomous vehicles relies heavily on accurate and reliable positioning and navigation systems. However, positioning systems based on a single sensor are prone to errors caused by environmental factors such as weather, lighting, and occlusion. To address this issue, we propose a fault-tolerant multi-sensor fusion positioning system that integrates information from a global navigation satellite system (GNSS), an inertial navigation system (INS), a LiDAR, and a camera. The system adopts a decentralized filtering framework with three parallel subsystems, IMU/LiDAR, IMU/Camera, and GNSS/INS, to estimate the pose of the autonomous vehicle accurately in real time. The LiDAR and camera subsystems combine high-frequency IMU information to estimate the pose through graph optimization. At the data fusion stage, a uniform motion model and the innovation covariance are exploited for fault diagnosis and for isolating harmful observations. Extensive experiments are performed on the KAIST dataset and in our self-recorded off-road environments. The results show that our method achieves an average root mean square trajectory error of 3.85 m over a total length of 11.06 km, indicating that the proposed multi-sensor fusion positioning method maintains high accuracy and fault tolerance in environments where GNSS signals suffer interference and environmental features are sparse.
KW - autonomous vehicle
KW - fault tolerance
KW - multi-source information fusion
UR - http://www.scopus.com/inward/record.url?scp=85180131952&partnerID=8YFLogxK
U2 - 10.1109/ICUS58632.2023.10318346
DO - 10.1109/ICUS58632.2023.10318346
M3 - Conference contribution
AN - SCOPUS:85180131952
T3 - Proceedings of 2023 IEEE International Conference on Unmanned Systems, ICUS 2023
SP - 81
EP - 86
BT - Proceedings of 2023 IEEE International Conference on Unmanned Systems, ICUS 2023
A2 - Song, Rong
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 IEEE International Conference on Unmanned Systems, ICUS 2023
Y2 - 13 October 2023 through 15 October 2023
ER -