Hybrid tracking for outdoor augmented reality system

Jing Chen*, Yongtian Wang, Yue Liu, Wei Liu, Junwei Guo, Jingdun Lin

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)

Abstract

To date, no single sensor can provide a complete solution for tracking in outdoor augmented reality systems. To improve the robustness and accuracy of real-time visual tracking, we present a sensor fusion algorithm that combines inertial sensors with a CMOS camera, making it suitable for unknown environments. The fusion algorithm uses an extended Kalman filter to fuse inertial and vision data and estimate the trajectory of the camera. Meanwhile, the inherent error drift of the inertial sensor is corrected using the vision information. The single-constraint-at-a-time (SCAAT) method is also introduced to assimilate the sequential observations. Experimental results show that the proper use of additional inertial information can effectively enhance the robustness and accuracy of visual tracking.
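The sketch below illustrates, in simplified form, the kind of filtering loop the abstract describes: an extended Kalman filter propagated with inertial data and corrected by visual observations assimilated one constraint at a time (SCAAT). The state layout, noise values, and measurement model are illustrative assumptions, not the authors' actual formulation.

```python
# Minimal illustrative sketch of inertial/vision fusion with a Kalman filter,
# assimilating one scalar visual constraint at a time (SCAAT-style).
# State, noise levels, and measurement model are simplified assumptions.
import numpy as np

class HybridEKF:
    def __init__(self):
        # Assumed simplified state: 2-D camera position and velocity [x, y, vx, vy].
        self.x = np.zeros(4)
        self.P = np.eye(4)
        self.Q = np.eye(4) * 1e-3   # process noise (assumed)
        self.r = 1e-2               # scalar measurement noise (assumed)

    def predict(self, accel, dt):
        """Propagate the state with an inertial (accelerometer) sample."""
        F = np.array([[1, 0, dt, 0],
                      [0, 1, 0, dt],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], dtype=float)
        B = np.array([[0.5 * dt**2, 0],
                      [0, 0.5 * dt**2],
                      [dt, 0],
                      [0, dt]], dtype=float)
        self.x = F @ self.x + B @ accel
        self.P = F @ self.P @ F.T + self.Q

    def update_scaat(self, z, H):
        """Fuse a single scalar visual observation: z is one measured value,
        H its 1x4 measurement row (Jacobian in a nonlinear setting)."""
        H = H.reshape(1, -1)
        y = z - float(H @ self.x)                 # innovation
        S = float(H @ self.P @ H.T) + self.r      # innovation variance
        K = (self.P @ H.T) / S                    # Kalman gain (4x1)
        self.x = self.x + K.flatten() * y
        self.P = (np.eye(4) - K @ H) @ self.P

# Usage: predict with an inertial sample, then assimilate two visual
# constraints sequentially instead of stacking them into one batch update.
ekf = HybridEKF()
ekf.predict(accel=np.array([0.1, 0.0]), dt=0.01)
ekf.update_scaat(z=0.05, H=np.array([1.0, 0.0, 0.0, 0.0]))  # x-position constraint
ekf.update_scaat(z=0.02, H=np.array([0.0, 1.0, 0.0, 0.0]))  # y-position constraint
print(ekf.x)
```

Processing each visual constraint as a separate scalar update is what lets the filter incorporate measurements as soon as they arrive, rather than waiting for a complete set, which is the motivation behind SCAAT-style assimilation.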

Original language: English
Pages (from-to): 204-209
Number of pages: 6
Journal: Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics
Volume: 22
Issue number: 2
Publication status: Published - Feb 2010

Keywords

  • Camera tracking
  • Outdoor augmented reality
  • Sensor fusion
  • Single constraint at a time
