Semi-Supervised CT Lesion Segmentation Using Uncertainty-Based Data Pairing and SwapMix

  • Pengchong Qiao
  • Han Li
  • Guoli Song
  • Hu Han
  • Zhiqiang Gao
  • Yonghong Tian
  • Yongsheng Liang
  • Xi Li
  • S. Kevin Zhou*
  • Jie Chen*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

14 Citations (Scopus)

Abstract

Semi-supervised learning (SSL) methods have shown strong performance in addressing data shortage in medical image segmentation. However, existing SSL methods still suffer from unreliable predictions on unannotated data due to the absence of manual annotations. In this paper, we propose an unreliability-diluted consistency training (UDiCT) mechanism that dilutes the unreliability in SSL by assembling reliable annotated data into unreliable unannotated data. Specifically, we first propose an uncertainty-based data pairing module that pairs annotated data with unannotated data according to a complementary uncertainty pairing rule, which prevents two hard samples from being paired together. Second, we develop SwapMix, a mixed sample data augmentation method, to integrate annotated data into unannotated data so that our model is trained in a low-unreliability manner. Finally, UDiCT is trained by minimizing a supervised loss and an unreliability-diluted consistency loss, which makes our model robust to diverse backgrounds. Extensive experiments on three chest CT datasets show the effectiveness of our method for semi-supervised CT lesion segmentation.
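The two key components described above can be illustrated with a minimal sketch. Note that the function names, the entropy-based uncertainty measure, the sort-and-zip pairing, and the fixed-fraction cuboid swap are all illustrative assumptions; the paper's actual pairing rule and SwapMix formulation may differ in detail.

```python
import numpy as np

def entropy_uncertainty(probs):
    """Sample-level uncertainty as mean voxel-wise entropy of
    softmax probabilities. probs: (C, D, H, W); higher = harder.
    (Assumed proxy; the paper may define uncertainty differently.)"""
    eps = 1e-8
    ent = -np.sum(probs * np.log(probs + eps), axis=0)
    return float(ent.mean())

def complementary_pairing(unc_labeled, unc_unlabeled):
    """Complementary pairing rule (sketch): sort annotated samples
    easy->hard and unannotated samples hard->easy, then zip, so a
    hard unannotated sample always meets an easy annotated one and
    two hard samples are never paired together."""
    lab = np.argsort(unc_labeled)           # easy -> hard
    unl = np.argsort(unc_unlabeled)[::-1]   # hard -> easy
    return list(zip(lab.tolist(), unl.tolist()))

def swap_mix(vol_labeled, vol_unlabeled, rng, patch_frac=0.5):
    """SwapMix-style augmentation (sketch): swap a random cuboid
    between a paired annotated and unannotated CT volume, yielding
    two mixed volumes in which annotated content dilutes the
    unreliable unannotated content."""
    d, h, w = vol_labeled.shape
    pd, ph, pw = int(d * patch_frac), int(h * patch_frac), int(w * patch_frac)
    z = rng.integers(0, d - pd + 1)
    y = rng.integers(0, h - ph + 1)
    x = rng.integers(0, w - pw + 1)
    region = (slice(z, z + pd), slice(y, y + ph), slice(x, x + pw))
    mixed_a, mixed_b = vol_labeled.copy(), vol_unlabeled.copy()
    mixed_a[region], mixed_b[region] = (vol_unlabeled[region].copy(),
                                        vol_labeled[region].copy())
    return mixed_a, mixed_b
```

In this sketch, a supervised loss would be applied to voxels originating from annotated volumes and a consistency loss to the remaining voxels, approximating the unreliability-diluted training objective.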

Original language: English
Pages (from-to): 1546-1562
Number of pages: 17
Journal: IEEE Transactions on Medical Imaging
Volume: 42
Issue number: 5
DOIs
Publication status: Published - 1 May 2023
Externally published: Yes

Keywords

  • Semi-supervised learning
  • lesion segmentation
  • unreliable pseudo labels
