TY - JOUR
T1 - A low-complexity sensor fusion algorithm based on a fiber-optic gyroscope aided camera pose estimation system
AU - Tan, Zhongwei
AU - Yang, Chuanchuan
AU - Li, Yuliang
AU - Yan, Yan
AU - He, Changhong
AU - Wang, Xinyue
AU - Wang, Ziyu
N1 - Publisher Copyright:
© 2016, Science China Press and Springer-Verlag Berlin Heidelberg.
PY - 2016/4/1
Y1 - 2016/4/1
N2 - Visual tracking, a popular computer vision technique, has a wide range of applications such as camera pose estimation. Conventional methods are mostly vision-only and require complex image processing because they rely on a single sensor. This paper proposes a novel sensor fusion algorithm that fuses data from a camera and a fiber-optic gyroscope. In this system, the camera acquires images and detects the object directly at the beginning of each tracking stage, while the relative motion between the camera and the object, measured by the fiber-optic gyroscope, tracks the object coordinates and thereby improves the effectiveness of visual tracking. The proposed sensor fusion algorithm therefore overcomes the drawbacks of each individual sensor and exploits their combination to track the object accurately. In addition, the computational complexity of the proposed algorithm is markedly lower than that of existing approaches (an 86% reduction for a 0.5 min visual tracking task). Experimental results show that this visual tracking system reduces the tracking error by 6.15% compared with the conventional vision-only tracking scheme (edge detection), and the proposed sensor fusion algorithm achieves long-term tracking with the help of bias drift suppression calibration.
AB - Visual tracking, a popular computer vision technique, has a wide range of applications such as camera pose estimation. Conventional methods are mostly vision-only and require complex image processing because they rely on a single sensor. This paper proposes a novel sensor fusion algorithm that fuses data from a camera and a fiber-optic gyroscope. In this system, the camera acquires images and detects the object directly at the beginning of each tracking stage, while the relative motion between the camera and the object, measured by the fiber-optic gyroscope, tracks the object coordinates and thereby improves the effectiveness of visual tracking. The proposed sensor fusion algorithm therefore overcomes the drawbacks of each individual sensor and exploits their combination to track the object accurately. In addition, the computational complexity of the proposed algorithm is markedly lower than that of existing approaches (an 86% reduction for a 0.5 min visual tracking task). Experimental results show that this visual tracking system reduces the tracking error by 6.15% compared with the conventional vision-only tracking scheme (edge detection), and the proposed sensor fusion algorithm achieves long-term tracking with the help of bias drift suppression calibration.
KW - camera pose estimation
KW - fiber-optic gyroscope
KW - low-complexity
KW - sensor fusion
KW - visual tracking
UR - http://www.scopus.com/inward/record.url?scp=84957601292&partnerID=8YFLogxK
U2 - 10.1007/s11432-015-5516-2
DO - 10.1007/s11432-015-5516-2
M3 - Article
AN - SCOPUS:84957601292
SN - 1674-733X
VL - 59
JO - Science China Information Sciences
JF - Science China Information Sciences
IS - 4
M1 - 042412
ER -