Obstacle recognition for intelligent vehicle based on radar and vision fusion

Zhenhua Pan, Kewei Li, Hongbin Deng*, Yiran Wei

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

Intelligent vehicles have attracted significant interest from both academia and industry. To address the obstacle detection and recognition problem for intelligent vehicles, this paper proposes a multi-objective detection method that recognizes pedestrians and vehicles by fusing a single-line laser radar with monocular vision. First, the laser radar detects and locates obstacles using a clustering algorithm. Second, based on the perspective transformation between the radar and the camera, the depth information of the environment is mapped into the image window, and a region of interest (ROI) is determined from the obstacle coordinates and the coordinate-mapping model. Then, histogram of oriented gradients (HOG) feature vectors are extracted from the ROI, and a pre-trained support vector machine (SVM) classifier identifies the obstacle type (pedestrian or vehicle). Finally, experiments under a variety of environmental conditions show that the proposed method detects and identifies obstacles effectively and accurately.
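The pipeline described in the abstract (radar clustering → radar-to-image ROI mapping → HOG features → SVM classification) can be sketched as follows. This is a minimal, numpy-only illustration, not the paper's implementation: the clustering gap, camera intrinsics `K`, assumed object height, ROI aspect ratio, and the single-histogram HOG (the paper would use per-cell HOG) are all illustrative assumptions, and the SVM is reduced to the decision rule of a hypothetically pre-trained linear classifier.

```python
import numpy as np

def cluster_radar_points(points, gap=0.8):
    """Greedy Euclidean clustering of single-line radar returns.

    points: (N, 2) array of (x, y) positions in the radar frame,
    assumed ordered by scan angle. Consecutive returns closer than
    `gap` meters are merged into one obstacle; each obstacle is
    reported as its cluster centroid.
    """
    clusters, current = [], [points[0]]
    for p in points[1:]:
        if np.linalg.norm(p - current[-1]) < gap:
            current.append(p)
        else:
            clusters.append(np.mean(current, axis=0))
            current = [p]
    clusters.append(np.mean(current, axis=0))
    return np.array(clusters)

def radar_to_roi(centre, K, obj_h=1.8):
    """Map a radar cluster centre (x forward, y left, in meters) to an
    image ROI (u, v, w, h) via a pinhole model with intrinsics K.
    Object height and the 2:1 height:width ratio are assumptions."""
    x, y = centre
    u = K[0, 0] * (-y / x) + K[0, 2]        # image column of the centre
    h_px = K[1, 1] * obj_h / x              # apparent height in pixels
    w_px = h_px / 2                         # assumed aspect ratio
    v_bottom = K[1, 2] + h_px / 2           # ROI bottom row (flat-ground assumption)
    return (int(u - w_px / 2), int(v_bottom - h_px), int(w_px), int(h_px))

def hog_feature(patch, bins=9):
    """Coarse HOG-style descriptor: one unsigned-orientation histogram,
    gradient-magnitude weighted, L2-normalised over the whole patch."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0, np.pi), weights=mag)
    return hist / (np.linalg.norm(hist) + 1e-9)

def linear_svm_predict(feature, w, b):
    """Decision rule of a pre-trained linear SVM (weights w, bias b):
    positive score -> pedestrian, negative -> vehicle (labels assumed)."""
    return "pedestrian" if feature @ w + b > 0 else "vehicle"

# --- toy end-to-end run on synthetic data ---
pts = np.array([[5.0, 0.1], [5.0, 0.2], [5.1, 0.25],   # obstacle 1
                [8.0, -2.0], [8.1, -2.1]])             # obstacle 2
centres = cluster_radar_points(pts)
K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
roi = radar_to_roi(centres[0], K)
patch = np.random.default_rng(0).random((64, 32))      # stand-in image crop
label = linear_svm_predict(hog_feature(patch), np.ones(9), 0.0)
```

The design mirrors the abstract's step ordering: radar supplies cheap, accurate range (where obstacles are), while vision supplies appearance (what they are), so the expensive HOG+SVM step only runs inside radar-proposed ROIs rather than over the whole image.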

Original language: English
Pages (from-to): 178-187
Number of pages: 10
Journal: International Journal of Robotics and Automation
Volume: 36
Issue number: 3
DOI: 10.2316/J.2021.206-0478
Publication status: Published - 18 Apr 2021

Keywords

  • Laser radar
  • Monocular camera
  • Obstacle recognition
  • Spatial synchronization

Cite this

Pan, Z., Li, K., Deng, H., & Wei, Y. (2021). Obstacle recognition for intelligent vehicle based on radar and vision fusion. International Journal of Robotics and Automation, 36(3), 178-187. https://doi.org/10.2316/J.2021.206-0478