Registration and fusion quantification of augmented reality based nasal endoscopic surgery

Yakui Chu, Jian Yang*, Shaodong Ma, Danni Ai, Wenjie Li, Hong Song, Liang Li, Duanduan Chen, Lei Chen, Yongtian Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

49 Citations (Scopus)

Abstract

This paper quantifies the registration and fusion display errors of augmented reality-based nasal endoscopic surgery (ARNES). We comparatively investigated the spatial calibration process for front-end endoscopy and redefined the accuracy level of a calibrated endoscope by using a calibration tool with improved structural reliability. We also studied how registration accuracy relates to the number and spatial distribution of the deployed fiducial points (FPs) used for positioning, as well as to the measured registration time. A physically integrated ARNES prototype was custom-configured for performance evaluation in skull base tumor resection surgery, with an innovative approach of dynamic endoscopic vision expansion. As advised by surgical experts in otolaryngology, we proposed a hierarchical rendering scheme to adapt the fused images to the required visual perception. By constraining the rendered view to a known depth and radius, the surgeon's visual focus is directed only to the anticipated critical anatomical and vascular structures, thereby avoiding misguidance. Furthermore, error analysis was conducted to examine the feasibility of hybrid optical tracking based on point clouds, which was proposed in our previous work as an in-surgery registration solution. Measured results indicated that the target registration error of ARNES can be reduced to 0.77 ± 0.07 mm. For initial registration, our results suggest that a trade-off that minimizes registration time is reached when the distribution of five FPs is considered. For in-surgery registration, our findings reveal that the intrinsic registration error is a major cause of performance loss. Rigid-model and cadaver experiments confirmed the smooth scene integration and display fluency of ARNES, and its practicality was further demonstrated in three clinical trials.
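The abstract does not detail the registration algorithm itself. As background, a minimal sketch is given below of standard paired-point rigid registration (the Kabsch/Horn closed-form SVD solution) together with the RMS alignment error over fiducials; this is a conventional baseline for fiducial-based registration and error reporting, not necessarily the exact pipeline used in ARNES, and all point values in the example are illustrative.

```python
import numpy as np

def rigid_register(src, dst):
    """Closed-form least-squares rigid alignment (rotation R, translation t)
    of paired fiducial points via SVD (Kabsch/Horn). src, dst: (N, 3) arrays."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                   # proper rotation, det(R) = +1
    t = dst_c - R @ src_c
    return R, t

def rms_error(R, t, pts_src, pts_dst):
    """RMS residual after applying the transform; over the fiducials this is
    the fiducial registration error (FRE), over held-out targets the TRE."""
    diff = (pts_src @ R.T + t) - pts_dst
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

# Illustrative example: five fiducials (as in the abstract's trade-off) with
# synthetic localization noise, aligned from patient space to tracker space.
rng = np.random.default_rng(0)
fids_patient = rng.uniform(0, 100, size=(5, 3))          # mm
t_true = np.array([10.0, -5.0, 2.0])
fids_tracker = fids_patient + t_true + rng.normal(0, 0.3, size=(5, 3))
R, t = rigid_register(fids_patient, fids_tracker)
print("FRE (mm):", rms_error(R, t, fids_patient, fids_tracker))
```

In practice the same residual computed at anatomical target points (rather than at the fiducials used for the fit) yields the target registration error figure of the kind reported in the abstract.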

Original language: English
Pages (from-to): 241-256
Number of pages: 16
Journal: Medical Image Analysis
Volume: 42
DOIs
Publication status: Published - Dec 2017

Keywords

  • Augmented reality
  • Endoscope
  • Fusion
  • Image registration
  • Image-guided surgery
  • Optical tracking
