TY - JOUR
T1 - Intraoperative Tumor-Augmented Imaging Method via a Graph-Optimized Fusion Framework of Endoluminal and External Endoscopic Vision
AU - Cao, Sifan
AU - Fan, Jingfan
AU - Yang, Yun
AU - Lu, Dawei
AU - Zheng, Zhao
AU - Jiang, Xuexin
AU - Zheng, Xiaohao
AU - Zhang, Yixi
AU - Fu, Tianyu
AU - Xiao, Deqiang
AU - Song, Hong
AU - Ai, Danni
AU - Wang, Yuanyuan
AU - Yang, Jian
N1 - Publisher Copyright:
© 1963-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Accurate intraoperative localization of luminal tumors is crucial for effective resection. However, the limited field of view of endoscopes hinders lesion visualization from the external operative perspective. Furthermore, existing endoscopic fusion navigation techniques struggle to align endoscopic images with preoperative CT scans after deformation of flexible organs. To address this, we propose a novel tumor-augmented imaging method that reconstructs tumor morphology from intracavitary endoscopy (IE) images and, by leveraging spatial relationships among sensors, fuses the lesion onto the views of multiple nonfixed laparoscopic/scene cameras (L/SCs). To mitigate multisource uncertainties in calibrating external visual coordinate systems, we propose a graph-optimization-based multiloop constraint calibration framework. The framework is first modeled as a directed graph (DG), followed by a parameter-sharing strategy that leverages a covariance-weighted approach to quantitatively characterize sensor uncertainty. Building on this, a reprojection error function is constructed for multiple nonfixed camera systems to globally optimize the hand–eye calibration matrix, enabling accurate fusion of tumors reconstructed from absolute tracking data into their corresponding positions in the external views. Experiments on self-built and public calibration datasets demonstrate state-of-the-art accuracy and strong robustness to sensor noise. Ex vivo studies validate the reliability of tumor fusion onto external views, providing a novel fusion-guided surgical navigation method for intraoperative tumor localization.
AB - Accurate intraoperative localization of luminal tumors is crucial for effective resection. However, the limited field of view of endoscopes hinders lesion visualization from the external operative perspective. Furthermore, existing endoscopic fusion navigation techniques struggle to align endoscopic images with preoperative CT scans after deformation of flexible organs. To address this, we propose a novel tumor-augmented imaging method that reconstructs tumor morphology from intracavitary endoscopy (IE) images and, by leveraging spatial relationships among sensors, fuses the lesion onto the views of multiple nonfixed laparoscopic/scene cameras (L/SCs). To mitigate multisource uncertainties in calibrating external visual coordinate systems, we propose a graph-optimization-based multiloop constraint calibration framework. The framework is first modeled as a directed graph (DG), followed by a parameter-sharing strategy that leverages a covariance-weighted approach to quantitatively characterize sensor uncertainty. Building on this, a reprojection error function is constructed for multiple nonfixed camera systems to globally optimize the hand–eye calibration matrix, enabling accurate fusion of tumors reconstructed from absolute tracking data into their corresponding positions in the external views. Experiments on self-built and public calibration datasets demonstrate state-of-the-art accuracy and strong robustness to sensor noise. Ex vivo studies validate the reliability of tumor fusion onto external views, providing a novel fusion-guided surgical navigation method for intraoperative tumor localization.
KW - Augmented imaging
KW - hand–eye calibration
KW - multiview fusion
KW - optimization
UR - https://www.scopus.com/pages/publications/105022831525
U2 - 10.1109/TIM.2025.3632455
DO - 10.1109/TIM.2025.3632455
M3 - Article
AN - SCOPUS:105022831525
SN - 0018-9456
VL - 74
JO - IEEE Transactions on Instrumentation and Measurement
JF - IEEE Transactions on Instrumentation and Measurement
M1 - 4020113
ER -