Real-time structure and motion by fusion of inertial and vision data for mobile AR system

Jing Chen*, Yong Tian Wang, Yue Liu, Axel Pinz

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This paper investigates whether adding inertial data improves the accuracy and robustness of visual tracking. The real-time structure and motion algorithm fuses the two modalities within a Kalman filter framework: an extended Kalman filter fuses the inertial and vision data, while a bank of Kalman filters estimates the sparse 3D structure of the real scene. A simple, known target is used for the initial pose estimation. The motion and structure estimation filters work alternately to recover the sensor motion, the scene structure, and other parameters. Real image sequences are used to test the algorithm. Experimental results show that proper use of the additional inertial information not only effectively improves the accuracy of pose and structure estimation, but also handles the occlusion problem.
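To make the fusion scheme described above concrete, the following is a minimal sketch (not the authors' implementation) of a loosely coupled extended Kalman filter in which inertial data drives the high-rate prediction step and visual measurements correct the state at a lower rate. The state layout, noise levels, and the direct-position measurement model are assumptions for illustration only; the paper's actual filter also recovers scene structure via a separate bank of Kalman filters, which is not shown here.

```python
import numpy as np

class InertialVisionEKF:
    """EKF over a simplified state x = [position(3), velocity(3)].

    Orientation is omitted to keep the prediction linear in this
    sketch; a full implementation would also track a quaternion and
    integrate gyroscope rates.
    """

    def __init__(self):
        self.x = np.zeros(6)                # [p, v]
        self.P = np.eye(6) * 1e-2           # state covariance
        self.Q = np.eye(6) * 1e-4           # process noise (assumed)
        self.R = np.eye(3) * 1e-3           # vision noise (assumed)

    def predict(self, accel, dt):
        """Propagate with an accelerometer sample (gravity removed)."""
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt          # p += v * dt
        B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
        self.x = F @ self.x + B @ accel
        self.P = F @ self.P @ F.T + self.Q

    def update(self, cam_position):
        """Correct with a camera-derived position measurement."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])  # measure p only
        y = cam_position - H @ self.x                 # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P

# Usage: predict at IMU rate, update whenever vision tracking succeeds.
# During occlusion the vision update is simply skipped and the filter
# coasts on inertial data, which is how such a fusion scheme can bridge
# short tracking dropouts.
ekf = InertialVisionEKF()
for k in range(100):
    ekf.predict(accel=np.array([0.0, 0.0, 0.1]), dt=0.01)  # 100 Hz IMU
    if k % 4 == 0:                                          # 25 Hz vision
        ekf.update(cam_position=ekf.x[:3] + 0.01 * np.random.randn(3))
```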

Original language: English
Pages (from-to): 431-436
Number of pages: 6
Journal: Journal of Beijing Institute of Technology (English Edition)
Volume: 15
Issue number: 4
Publication status: Published - Dec 2006

Keywords

  • Augmented reality
  • Hybrid tracking
  • Structure and motion
