Sensor fusion for vision-based indoor head pose tracking

Bin Luo*, Yongtian Wang, Yue Liu

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

Accurate head pose tracking is a key issue for indoor augmented reality systems. This paper proposes a novel approach to tracking the head pose of indoor users using sensor fusion. The proposed approach employs a track-to-track fusion framework, composed of extended Kalman filters and a fusion filter, to fuse the poses produced by the two complementary tracking modes of inside-out tracking (IOT) and outside-in tracking (OIT). A vision-based head tracker is constructed to verify the approach. Preliminary experimental results show that the tracker achieves more accurate and stable pose estimates than either single tracking mode (IOT or OIT) alone, which validates the usefulness of the proposed sensor fusion approach.
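The track-to-track step described in the abstract can be sketched as a covariance-weighted combination of the two filtered pose tracks. The snippet below is illustrative only, not the paper's actual fusion filter: it assumes the IOT and OIT track errors are independent (cross-covariance neglected) and uses hypothetical 2-DOF pose values.

```python
import numpy as np

def fuse_tracks(x1, P1, x2, P2):
    """Covariance-weighted track-to-track fusion of two state estimates.

    Each input is a state vector x with error covariance P, e.g. the
    outputs of two extended Kalman filters running on separate sensors.
    Cross-correlation between the tracks is neglected in this sketch.
    """
    P1_inv = np.linalg.inv(P1)
    P2_inv = np.linalg.inv(P2)
    P_fused = np.linalg.inv(P1_inv + P2_inv)           # fused covariance
    x_fused = P_fused @ (P1_inv @ x1 + P2_inv @ x2)    # fused state
    return x_fused, P_fused

# Hypothetical 2-DOF pose estimates from the two tracking modes:
x_iot = np.array([1.00, 0.10])     # inside-out track estimate
P_iot = np.diag([0.04, 0.09])      # its error covariance
x_oit = np.array([1.10, 0.05])     # outside-in track estimate
P_oit = np.diag([0.09, 0.04])

x_f, P_f = fuse_tracks(x_iot, P_iot, x_oit, P_oit)
print(x_f)           # fused pose lies between the two inputs
print(np.diag(P_f))  # fused variances are smaller than either input's
```

Under the independence assumption the fused covariance is never larger than either input covariance, which is why the combined tracker can be more stable than IOT or OIT alone.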

Original language: English
Title of host publication: Proceedings of the 5th International Conference on Image and Graphics, ICIG 2009
Publisher: IEEE Computer Society
Pages: 677-682
Number of pages: 6
ISBN (Print): 9780769538839
Publication status: Published - 2009
Event: 5th International Conference on Image and Graphics, ICIG 2009 - Xi'an, Shanxi, China
Duration: 20 Sep 2009 → 23 Sep 2009

Publication series

Name: Proceedings of the 5th International Conference on Image and Graphics, ICIG 2009

Conference

Conference: 5th International Conference on Image and Graphics, ICIG 2009
Country/Territory: China
City: Xi'an, Shanxi
Period: 20/09/09 → 23/09/09
