TY - JOUR
T1 - Multi-sensor data fusion for optical tracking of head pose
AU - Luo, Bin
AU - Wang, Yong Tian
AU - Liu, Yue
PY - 2010/9
Y1 - 2010/9
AB - Accurate head pose tracking is a key issue in achieving precise registration in indoor augmented reality systems. This paper proposes a novel approach based on multi-sensor data fusion to achieve optical tracking of head pose with high accuracy. The approach employs two extended Kalman filters and one fusion filter for the multi-sensor environment to fuse the pose data from two complementary optical trackers: an inside-out tracker (IOT) with one camera and an outside-in tracker (OIT) with two cameras. The aim is to reduce the pose errors from the optical tracking sensors. A representative experimental setup is designed to verify the approach. The experimental results show that, in the static state, the pose errors of the IOT and the OIT are consistent with the theoretical results obtained by propagating the error covariance matrices from the respective image noises to the final pose errors, and that, in the dynamic state, the proposed multi-sensor data fusion approach, used with the combined optical tracker, achieves more accurate and more stable position and orientation outputs than a single IOT or OIT alone.
KW - Augmented reality
KW - Error covariance matrix
KW - Head pose tracking
KW - Multi-sensor data fusion
UR - http://www.scopus.com/inward/record.url?scp=78049321051&partnerID=8YFLogxK
U2 - 10.3724/SP.J.1004.2010.01239
DO - 10.3724/SP.J.1004.2010.01239
M3 - Article
AN - SCOPUS:78049321051
SN - 0254-4156
VL - 36
SP - 1239
EP - 1249
JO - Zidonghua Xuebao/Acta Automatica Sinica
JF - Zidonghua Xuebao/Acta Automatica Sinica
IS - 9
ER -