Deep learning method on target echo signal recognition for obscurant penetrating lidar detection in degraded visual environments

Xujia Liang, Zhonghua Huang, Liping Lu, Zhigang Tao, Bing Yang, Yinlin Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

With the rapid development of autonomous vehicles and mobile robotics, the demand for robust light detection and ranging (Lidar) detection methods in real-world applications is increasing. However, this task still suffers in degraded visual environments (DVE), including smoke, dust, fog, and rain, as the aerosols lead to false alarms and sensor dysfunction. Therefore, a novel Lidar target echo signal recognition method, based on multi-distance measurement and a deep learning algorithm, is presented in this paper; neither backscatter suppression nor denoising functions are required. The 2-D spectrogram images are constructed using the frequency-distance relation derived from the 1-D echo signals of an individual Lidar sensor cell while approaching the target. The characteristics of the target echo signal and of the noise in the spectrogram images are analyzed, and the target recognition criterion is established accordingly. A customized deep learning algorithm is subsequently developed to perform the recognition. Simulation and experimental results demonstrate that the proposed method significantly improves Lidar detection performance in DVE.
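To make the described pipeline concrete, the following is a minimal sketch of the two stages the abstract outlines: stacking 1-D echo signals recorded at successive distances into a 2-D spectrogram image, then classifying that image with a small neural network. This is an illustrative assumption of how such a pipeline could be wired, not the authors' published implementation; the names make_spectrogram and EchoNet, the STFT parameters, and the network architecture are all hypothetical.

# Hypothetical sketch: 2-D spectrogram construction from 1-D echo signals,
# followed by a small CNN that labels the image as target echo vs. noise.
# Names, parameters, and architecture are illustrative, not from the paper.
import numpy as np
from scipy.signal import spectrogram
import torch
import torch.nn as nn

def make_spectrogram(echoes, fs=1.0e6, nperseg=64):
    """Stack 1-D echo signals from successive distances and convert them
    to a 2-D time-frequency image via the short-time Fourier transform."""
    signal = np.concatenate(echoes)                    # multi-distance measurements
    f, t, Sxx = spectrogram(signal, fs=fs, nperseg=nperseg)
    img = 10 * np.log10(Sxx + 1e-12)                   # log-power spectrogram
    img = (img - img.min()) / (img.max() - img.min() + 1e-12)  # scale to [0, 1]
    return img.astype(np.float32)

class EchoNet(nn.Module):
    """Small CNN classifier over spectrogram images: {noise, target echo}."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(16 * 4 * 4, 2)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Usage with placeholder signals standing in for real Lidar echoes:
echoes = [np.random.randn(1024) for _ in range(8)]
img = make_spectrogram(echoes)
batch = torch.from_numpy(img)[None, None]              # shape (1, 1, F, T)
logits = EchoNet()(batch)
print(logits.argmax(dim=1))                            # 0 = noise, 1 = target

In this sketch the recognition criterion described in the abstract would be learned implicitly by the classifier from labeled spectrogram images rather than applied as an explicit rule.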

Original language: English
Article number: 3424
Pages (from-to): 1-16
Number of pages: 16
Journal: Sensors
Volume: 20
Issue number: 12
DOIs
Publication status: Published - 2 Jun 2020

Keywords

  • 2-D spectrogram image
  • Deep learning
  • Lidar
  • Obscurant penetrating
  • Degraded visual environment (DVE)

