Outdoor guide system based on the mobile augmented reality technology

Yunchao Zhang, Jing Chen*, Yongtian Wang, Zhiwei Xu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

This paper proposes an outdoor guide system using vision-based augmented reality (AR) on mobile devices. Augmented reality provides a virtual-real fusion display interface for outdoor guidance. Vision-based methods are more accurate than GPS or other hardware-based methods, but they demand more resources and stronger computing power than mobile devices typically offer. This paper therefore introduces a client/server (C/S) framework for a vision-based augmented reality system. On the server, a vocabulary tree is used for location recognition; on the mobile device, BRISK features are combined with optical flow to track offline keyframes. The system is tested on the UKbench dataset and in a real environment. Experimental results show that the proposed vision-based augmented reality system works well, achieves a relatively high recognition rate, and delivers real-time recognition performance on the mobile device.
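The abstract gives no implementation details, but the client-side idea it describes, BRISK features extracted from an offline keyframe combined with optical-flow tracking of the live camera stream, can be illustrated with a short OpenCV sketch. This is a minimal illustration under assumed settings, not the authors' code: the keyframe file name, camera index, window size, and re-initialization threshold are all hypothetical, and the server-side vocabulary-tree lookup and the final AR registration step are omitted.

import cv2
import numpy as np

# BRISK detector/descriptor and a Hamming-distance brute-force matcher.
brisk = cv2.BRISK_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Hypothetical offline keyframe; in the described system it would be the
# keyframe associated with the location recognized by the server.
keyframe = cv2.imread("keyframe.png", cv2.IMREAD_GRAYSCALE)
kf_kp, kf_desc = brisk.detectAndCompute(keyframe, None)

cap = cv2.VideoCapture(0)          # live camera stream (illustrative index)
prev_gray, prev_pts = None, None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    if prev_pts is None or len(prev_pts) < 20:
        # (Re)initialize by matching BRISK descriptors against the keyframe.
        kp, desc = brisk.detectAndCompute(gray, None)
        matches = matcher.match(kf_desc, desc)
        prev_pts = np.float32(
            [kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    else:
        # Cheap frame-to-frame tracking with pyramidal Lucas-Kanade optical
        # flow, so full feature detection is not needed on every frame.
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
        prev_pts = next_pts[status.ravel() == 1].reshape(-1, 1, 2)

    prev_gray = gray
    # A pose or homography estimated from the tracked points would drive the
    # virtual-real overlay; that registration step is omitted here.

The server-side vocabulary tree (hierarchical quantization of local descriptors for fast location recognition) is not sketched, since the abstract provides no details about its structure or training.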

Original language: English
Pages (from-to): 301-307
Number of pages: 7
Journal: High Technology Letters
Volume: 20
Issue number: 3
DOIs
Publication status: Published - 1 Sept 2014

Keywords

  • Location recognition
  • Mobile augmented reality
  • Optical flow
  • Tracking and registration
  • Vocabulary tree
