Hybrid map-based navigation method for unmanned ground vehicle in urban scenario

Yuwen Hu, Jianwei Gong*, Yan Jiang, Lu Liu, Guangming Xiong, Huiyan Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

21 Citations (Scopus)

Abstract

To reduce the data size of the metric map and the computational cost of map matching in unmanned ground vehicle self-driving navigation in urban scenarios, a metric-topological hybrid map navigation system is proposed in this paper. According to the different positioning accuracy requirements, urban areas are divided into strong constraint (SC) areas, such as roads with lanes, and loose constraint (LC) areas, such as intersections and open areas. As the heading of the self-driving vehicle is guided by traffic lanes and global waypoints in the road network, a simple topological map suffices for navigation in the SC areas. In the LC areas, by contrast, navigation relies mainly on positioning information. Simultaneous localization and mapping technology is used to provide a detailed metric map in the LC areas, and a window constraint Markov localization algorithm is introduced to obtain an accurate position from a laser scanner. Furthermore, the real-time performance of the Markov algorithm is enhanced by using a constraint window to restrict the size of the state space. By registering the metric maps into the road network, a hybrid map of the urban scenario can be constructed. Tests of mapping and navigation on a real unmanned vehicle demonstrated the capabilities of the proposed method.
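The window-constraint idea in the abstract can be illustrated with a minimal sketch: a grid-based Markov localization step in which the measurement update is evaluated only inside a window centred on the current belief peak, rather than over the whole state space. This is a simplified, hypothetical 1-D illustration, not the authors' implementation; the function name, toy corridor, and Gaussian likelihood are assumptions for demonstration.

```python
import numpy as np

def window_markov_update(belief, motion, likelihood, half_window):
    """One window-constrained Markov localization step on a 1-D grid.

    belief      -- prior probability over grid cells
    motion      -- integer cell shift predicted by odometry
    likelihood  -- per-cell measurement likelihood (e.g. from laser scan matching)
    half_window -- half-width of the constraint window around the belief peak
    """
    n = belief.size
    # Prediction: shift the belief by the odometry motion (no diffusion, for brevity)
    predicted = np.roll(belief, motion)
    # Restrict the measurement update to a window centred on the most likely cell;
    # this is what keeps the effective state space (and the cost) small
    centre = int(np.argmax(predicted))
    lo, hi = max(0, centre - half_window), min(n, centre + half_window + 1)
    posterior = np.zeros(n)
    posterior[lo:hi] = predicted[lo:hi] * likelihood[lo:hi]
    s = posterior.sum()
    return posterior / s if s > 0 else predicted  # fall back if the window misses

# Toy example: 100-cell corridor, vehicle believed to be near cell 50,
# odometry reports +2 cells, laser measurement peaks at cell 52
belief = np.full(100, 1.0 / 100)
belief[50] = 0.5
belief /= belief.sum()
likelihood = np.exp(-0.5 * ((np.arange(100) - 52) / 2.0) ** 2)
posterior = window_markov_update(belief, motion=2, likelihood=likelihood, half_window=5)
print(int(np.argmax(posterior)))  # → 52
```

Only the cells inside the window are multiplied by the likelihood, so the per-scan cost scales with the window size instead of the map size, which is the real-time gain the paper attributes to the constraint window.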

Original language: English
Pages (from-to): 3662-3680
Number of pages: 19
Journal: Remote Sensing
Volume: 5
Issue number: 8
DOIs: yes
Publication status: Published - 2013

Keywords

  • Hybrid map
  • Laser scanner
  • Markov localization
  • Metric map
  • Simultaneous localization and mapping
  • Topological map
  • Unmanned ground vehicle
