TY - JOUR
T1 - Homography-based robust pose compensation and fusion imaging for augmented reality based endoscopic navigation system
AU - Li, Wenjie
AU - Fan, Jingfan
AU - Li, Shaowen
AU - Tian, Zhaorui
AU - Ai, Danni
AU - Song, Hong
AU - Yang, Jian
N1 - Publisher Copyright:
© 2021 Elsevier Ltd
PY - 2021/11
Y1 - 2021/11
N2 - Background: Augmented reality (AR) based fusion imaging in endoscopic surgery relies on the quality of image-to-patient registration and camera calibration, two offline steps that are usually performed independently to obtain their respective target transformations. The solutions obtained under these independent conditions may be locally but not globally optimal, and the residual errors accumulate and eventually lead to inaccurate AR fusion. Methods: After a careful analysis of the principle of AR imaging, a robust online calibration framework was proposed for the endoscopic camera to enable accurate AR fusion. A 2D checkerboard-based homography estimation algorithm was proposed to estimate the local pose of the endoscopic camera, and the least-squares method was used, in combination with the optical tracking system, to calculate the compensation matrix. Results: Compared with conventional methods, the proposed compensation method improved the performance of AR fusion, reducing physical error by up to 82%, reducing pixel error by up to 83%, and improving target coverage by up to 6%. Experiments simulating mechanical noise showed that the proposed compensation method effectively corrected the fusion errors caused by rotation of the endoscopic tube without recalibrating the camera. Furthermore, the simulation results demonstrated the robustness of the proposed compensation method to noise. Conclusions: Overall, the experimental results demonstrated the effectiveness of the proposed compensation method and online calibration framework, and revealed considerable potential for clinical practice.
AB - Background: Augmented reality (AR) based fusion imaging in endoscopic surgery relies on the quality of image-to-patient registration and camera calibration, two offline steps that are usually performed independently to obtain their respective target transformations. The solutions obtained under these independent conditions may be locally but not globally optimal, and the residual errors accumulate and eventually lead to inaccurate AR fusion. Methods: After a careful analysis of the principle of AR imaging, a robust online calibration framework was proposed for the endoscopic camera to enable accurate AR fusion. A 2D checkerboard-based homography estimation algorithm was proposed to estimate the local pose of the endoscopic camera, and the least-squares method was used, in combination with the optical tracking system, to calculate the compensation matrix. Results: Compared with conventional methods, the proposed compensation method improved the performance of AR fusion, reducing physical error by up to 82%, reducing pixel error by up to 83%, and improving target coverage by up to 6%. Experiments simulating mechanical noise showed that the proposed compensation method effectively corrected the fusion errors caused by rotation of the endoscopic tube without recalibrating the camera. Furthermore, the simulation results demonstrated the robustness of the proposed compensation method to noise. Conclusions: Overall, the experimental results demonstrated the effectiveness of the proposed compensation method and online calibration framework, and revealed considerable potential for clinical practice.
KW - Augmented reality
KW - Camera pose compensation
KW - Endoscope calibration
KW - Image-to-patient registration
KW - Online calibration
UR - http://www.scopus.com/inward/record.url?scp=85116564136&partnerID=8YFLogxK
U2 - 10.1016/j.compbiomed.2021.104864
DO - 10.1016/j.compbiomed.2021.104864
M3 - Article
C2 - 34634638
AN - SCOPUS:85116564136
SN - 0010-4825
VL - 138
JO - Computers in Biology and Medicine
JF - Computers in Biology and Medicine
M1 - 104864
ER -