Real-time structure and motion by fusion of inertial and vision data for mobile AR system

Jing Chen*, Yongtian Wang, Yue Liu, Axel Pinz

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

This paper investigates whether adding inertial data improves the accuracy and robustness of visual tracking. The real-time structure and motion algorithm fuses data within a Kalman filter framework: an extended Kalman filter fuses the inertial and vision measurements, while a bank of Kalman filters estimates the sparse 3D structure of the real scene. A simple, known target is used for the initial pose estimation. The motion and structure estimation filters work alternately to recover the sensor motion, the scene structure, and other parameters. Real image sequences are used to test the algorithm. Experimental results show that the proper use of additional inertial information not only effectively improves the accuracy of the pose and structure estimation, but also handles the occlusion problem.
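The abstract describes an EKF-based fusion where inertial and vision data are combined in a predict/correct loop. The sketch below is a deliberately simplified 1D illustration of that loose-coupling idea, not the paper's filter: the inertial measurement (acceleration) drives the prediction step, and the lower-rate vision measurement (position) drives the correction step. All names and noise values here are illustrative assumptions.

```python
# Minimal 1D sketch of inertial/vision fusion in a Kalman filter.
# Inertial data (acceleration) propagates the state; vision data
# (position fixes) corrects it. Didactic only, not the paper's EKF.

def predict(x, v, p_xx, p_vv, accel, dt, q=1e-3):
    """Propagate state (position x, velocity v) with measured acceleration."""
    x_new = x + v * dt + 0.5 * accel * dt * dt
    v_new = v + accel * dt
    # Inflate the (diagonal) covariance to account for process/IMU noise.
    return x_new, v_new, p_xx + p_vv * dt * dt + q, p_vv + q

def update(x, v, p_xx, p_vv, z, r=1e-2):
    """Correct the position estimate with a vision measurement z."""
    k = p_xx / (p_xx + r)        # Kalman gain for the position state
    x_new = x + k * (z - x)      # innovation-weighted correction
    return x_new, v, (1.0 - k) * p_xx, p_vv

# Track a target under constant acceleration a = 2 m/s^2 for 5 s.
a, dt = 2.0, 0.1
x, v, p_xx, p_vv = 0.0, 0.0, 1.0, 1.0
for step in range(1, 51):
    t = step * dt
    x, v, p_xx, p_vv = predict(x, v, p_xx, p_vv, a, dt)
    # Vision fixes arrive at a lower rate (every 5th inertial step),
    # mirroring the high-rate-IMU / low-rate-camera setting.
    if step % 5 == 0:
        x, v, p_xx, p_vv = update(x, v, p_xx, p_vv, 0.5 * a * t * t)

print(abs(x - 0.5 * a * 25.0) < 0.05)
```

The high-rate inertial prediction keeps the estimate available between camera frames, which is also why such a filter can coast through short visual occlusions, as the abstract notes.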

Original language: English
Pages (from-to): 431-436
Number of pages: 6
Journal: Journal of Beijing Institute of Technology (English Edition)
Volume: 15
Issue number: 4
Publication status: Published - Dec 2006
