Automatic cervical tumors segmentation in PET/MRI by parallel encoder U-net

Shuai Liu, Zheng Tan, Tan Gong, Xiaoying Tang, Hongzan Sun, Fei Shang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Background: Automatic segmentation of cervical tumors is important for quantitative analysis and radiotherapy planning.

Methods: A parallel encoder U-Net (PEU-Net) integrating the multi-modality information of PET/MRI was proposed to segment cervical tumors. It consisted of two parallel encoders with the same structure for the PET and MR images; the features of the two modalities were extracted separately and fused at each layer of the decoder. A Res2Net module on the skip connections aggregated features at multiple scales and refined the segmentation. PET/MRI images of 165 patients with cervical cancer were included in this study. U-Net, TransUNet, and nnU-Net with single- or multi-modality input (PET and/or T2WI) were used for comparison. The Dice similarity coefficient (DSC) on volume data, and the DSC and the 95th percentile of the Hausdorff distance (HD95) on tumor slices, were calculated to evaluate performance.

Results: The proposed PEU-Net exhibited the best performance (DSC3d: 0.726 ± 0.204; HD95: 4.603 ± 4.579 mm), and its DSC2d (0.871 ± 0.113) was comparable to the best result, obtained by TransUNet with PET/MRI input (0.873 ± 0.125).

Conclusions: The networks with multi-modality input outperformed those with single-modality images as input. The results showed that the proposed PEU-Net could use multi-modality information more effectively through its redesigned structure and achieved competitive performance.
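To make the architecture description concrete, below is a minimal PyTorch sketch of a dual-encoder U-Net in the spirit of PEU-Net: two identical encoders (one per modality) whose skip features are fused with the upsampled features at every decoder level. The layer widths, the fusion operations (concatenation at the decoder levels, addition at the bottleneck), and the omission of the Res2Net refinement module on the skip connections are assumptions made for brevity, not the authors' implementation.

# Minimal sketch of a parallel-encoder U-Net (assumed configuration, not the paper's code).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3 conv + BN + ReLU layers, the standard U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class PEUNetSketch(nn.Module):
    """Two identical encoders (PET, T2WI); their features are fused at every decoder level."""

    def __init__(self, widths=(32, 64, 128, 256)):
        super().__init__()
        def make_encoder():
            blocks, ch = nn.ModuleList(), 1
            for w in widths:
                blocks.append(conv_block(ch, w))
                ch = w
            return blocks
        self.enc_pet, self.enc_mr = make_encoder(), make_encoder()
        self.pool = nn.MaxPool2d(2)
        # Decoder: at each level, upsampled features are concatenated with the
        # skip features from *both* encoders (the assumed fusion strategy).
        self.ups, self.dec = nn.ModuleList(), nn.ModuleList()
        rev = widths[::-1]  # (256, 128, 64, 32)
        for i in range(len(rev) - 1):
            self.ups.append(nn.ConvTranspose2d(rev[i], rev[i + 1], 2, stride=2))
            self.dec.append(conv_block(rev[i + 1] * 3, rev[i + 1]))  # up + PET skip + MR skip
        self.head = nn.Conv2d(widths[0], 1, 1)  # binary tumor mask

    def _encode(self, x, encoder):
        feats = []
        for i, block in enumerate(encoder):
            x = block(x if i == 0 else self.pool(x))
            feats.append(x)
        return feats

    def forward(self, pet, mr):
        f_pet, f_mr = self._encode(pet, self.enc_pet), self._encode(mr, self.enc_mr)
        x = f_pet[-1] + f_mr[-1]  # assumed additive fusion at the bottleneck
        for i, (up, dec) in enumerate(zip(self.ups, self.dec)):
            x = up(x)
            level = -(i + 2)  # matching encoder level for the skip connection
            x = dec(torch.cat([x, f_pet[level], f_mr[level]], dim=1))
        return torch.sigmoid(self.head(x))

if __name__ == "__main__":
    model = PEUNetSketch()
    pet = torch.randn(1, 1, 128, 128)  # single-channel PET slice
    mr = torch.randn(1, 1, 128, 128)   # single-channel T2WI slice
    print(model(pet, mr).shape)        # torch.Size([1, 1, 128, 128])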
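The two reported metrics can likewise be sketched from their definitions: the DSC measures overlap between the predicted and reference masks, and HD95 is the 95th percentile of the symmetric surface-to-surface distances. The boundary extraction by binary erosion below is one common convention and may differ from the implementation used in the paper.

# Minimal sketch of DSC and HD95 for binary masks (assumed implementation details).
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def dice(pred, gt):
    """DSC = 2|A ∩ B| / (|A| + |B|)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    denom = pred.sum() + gt.sum()
    return 2.0 * np.logical_and(pred, gt).sum() / denom if denom else 1.0

def hd95(pred, gt, spacing=1.0):
    """95th percentile of symmetric surface distances (in mm if `spacing` is the voxel size)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    surf_p = pred & ~binary_erosion(pred)  # boundary voxels of the prediction
    surf_g = gt & ~binary_erosion(gt)      # boundary voxels of the reference
    if not surf_p.any() or not surf_g.any():
        return float("nan")  # undefined when a mask is empty
    # Distance from every voxel to the nearest surface voxel of the other mask.
    d_to_g = distance_transform_edt(~surf_g, sampling=spacing)
    d_to_p = distance_transform_edt(~surf_p, sampling=spacing)
    dists = np.concatenate([d_to_g[surf_p], d_to_p[surf_g]])
    return np.percentile(dists, 95)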

Original language: English
Article number: 95
Journal: Radiation Oncology
Volume: 20
Issue number: 1
DOIs
Publication status: Published - Dec 2025
Externally published: Yes

Keywords

  • Deep learning
  • PET/MRI
  • Tumors segmentation
