Abstract
In battlefield environments, the perception system of an unmanned vehicle is susceptible to smoke, dust, and other adverse weather. Its ability to detect and track key objects degrades sharply under such conditions, leading to serious consequences such as missed detections, false detections, and object loss. To address this problem, a fusion system combining millimeter-wave (MMW) radar and an infrared camera is developed. An object-level fusion method is adopted to establish simple and effective fusion rules, extract and combine the dominant information from each sensor, and output stable perception results. The MMW radar objects are validated and extracted, and an improved DBSCAN clustering algorithm is proposed to suppress MMW radar noise. A MobileNetv2 backbone is introduced into the YOLOv4 network, and during training, transfer learning is used to augment the infrared data samples, alleviating the shortage of infrared training images. Experimental results show that, compared with an algorithm based on the infrared camera alone, the fusion algorithm achieves significantly better accuracy and high real-time performance in a smoke environment. It realizes object detection and tracking through the fusion of MMW radar and infrared camera in smoke and improves the anti-interference capability of the unmanned vehicle's object detection and tracking system.
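The abstract does not specify how the improved DBSCAN variant differs from the original, but the noise-rejection step it builds on is standard: points with too few neighbors within a radius `eps` are labeled as noise, which is how spurious radar returns get filtered out. The sketch below is a minimal, generic DBSCAN on 2D radar point coordinates; `eps`, `min_pts`, and the point data are illustrative assumptions, not values from the paper.

```python
import math

def dbscan(points, eps, min_pts):
    """Classic DBSCAN over 2D points; returns one label per point.

    Points in a dense region share a cluster id (0, 1, ...); isolated
    points (e.g. spurious MMW radar returns) are labeled -1 (noise).
    """
    labels = [None] * len(points)
    cluster_id = 0

    def neighbors(i):
        # All points within eps of point i (Euclidean), including i itself.
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if math.hypot(xi - xj, yi - yj) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # provisionally noise; may become a border point
            continue
        # i is a core point: start a new cluster and expand it.
        labels[i] = cluster_id
        seeds = list(nbrs)
        k = 0
        while k < len(seeds):
            j = seeds[k]
            k += 1
            if labels[j] == -1:
                labels[j] = cluster_id   # border point reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster_id
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:   # j is also a core point: keep expanding
                seeds.extend(j_nbrs)
        cluster_id += 1
    return labels

# Illustrative data: four tightly grouped returns plus one isolated outlier.
labels = dbscan([(0, 0), (0.5, 0), (0, 0.5), (0.5, 0.5), (10.0, 10.0)],
                eps=1.0, min_pts=3)
# The outlier is rejected as noise (-1); the dense group forms cluster 0.
```

The paper's improvement presumably adapts the density parameters to radar characteristics (e.g. range-dependent point density), but that detail is not given in the abstract.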
| Translated title of the contribution | Object Detection and Tracking for Unmanned Vehicles Based on Fusion of Infrared Camera and MMW Radar in Smoke-obscured Environment |
| --- | --- |
| Original language | Chinese (Traditional) |
| Pages (from-to) | 893-906 |
| Number of pages | 14 |
| Journal | Binggong Xuebao/Acta Armamentarii |
| Volume | 45 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 22 Mar 2024 |