Sensor fusion for vision-based indoor head pose tracking

Bin Luo*, Yongtian Wang, Yue Liu

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

Accurate head pose tracking is a key issue for indoor augmented reality systems. This paper proposes a novel approach to tracking the head pose of indoor users using sensor fusion. The proposed approach utilizes a track-to-track fusion framework, composed of extended Kalman filters and a fusion filter, to fuse the poses from the two complementary tracking modes of inside-out tracking (IOT) and outside-in tracking (OIT). A vision-based head tracker is constructed to verify the approach. Preliminary experimental results show that the tracker achieves more accurate and stable pose estimates than either single tracking mode (IOT or OIT) alone, which validates the usefulness of the proposed sensor fusion approach.
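As an illustration of the fusion step the abstract describes, the sketch below shows the classic track-to-track fusion rule for combining two independent state estimates (here, an IOT track and an OIT track), each assumed Gaussian with its own covariance. This is a generic textbook formulation with cross-covariance neglected, not the authors' exact filter; the example state and covariance values are hypothetical.

```python
import numpy as np

def fuse_tracks(x1, P1, x2, P2):
    """Information-form track-to-track fusion of two independent estimates.

    The fused estimate is the covariance-weighted average of the two
    input tracks; the fused covariance is never larger than either input.
    Cross-covariance between the tracks is neglected for simplicity.
    """
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    P = np.linalg.inv(P1_inv + P2_inv)       # fused covariance
    x = P @ (P1_inv @ x1 + P2_inv @ x2)      # fused state
    return x, P

# Hypothetical 1-DOF example: a yaw angle (radians) estimated by an
# inside-out (IOT) track and an outside-in (OIT) track, each produced
# by its own extended Kalman filter with its own uncertainty.
x_iot, P_iot = np.array([0.10]), np.array([[0.04]])
x_oit, P_oit = np.array([0.16]), np.array([[0.01]])
x_f, P_f = fuse_tracks(x_iot, P_iot, x_oit, P_oit)
# The fused estimate lies between the two tracks, pulled toward the
# more certain OIT track, and its variance is below both inputs.
```

In a full pipeline each per-mode EKF would run at sensor rate and hand its state and covariance to a fusion filter applying a rule of this form at each update.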

Original language: English
Title of host publication: Proceedings of the 5th International Conference on Image and Graphics, ICIG 2009
Publisher: IEEE Computer Society
Pages: 677-682
Number of pages: 6
ISBN (Print): 9780769538839
DOIs
Publication status: Published - 2009
Event: 5th International Conference on Image and Graphics, ICIG 2009 - Xi'an, Shanxi, China
Duration: 20 Sept 2009 - 23 Sept 2009

Publication series

Name: Proceedings of the 5th International Conference on Image and Graphics, ICIG 2009

Conference

Conference: 5th International Conference on Image and Graphics, ICIG 2009
Country/Territory: China
City: Xi'an, Shanxi
Period: 20/09/09 - 23/09/09

Keywords

  • Augmented reality
  • Extended Kalman filter
  • Head pose tracking
  • Sensor fusion
