Adaptive Knowledge Distillation for High-Quality Unsupervised MRI Reconstruction with Model-Driven Priors

Zhengliang Wu, Xuesong Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Magnetic Resonance Imaging (MRI) reconstruction has made significant progress with the introduction of Deep Learning (DL) technology combined with Compressed Sensing (CS). However, most existing methods require large fully sampled training datasets to supervise the training process, which may be unavailable in many applications. Current unsupervised models also show limitations in performance or speed and may face misaligned distributions during testing. This paper proposes an unsupervised method for training competitive reconstruction models that generate high-quality samples in an end-to-end style. First, teacher models are trained in a self-supervised manner by reconstructing re-undersampled images and comparing the results against the original undersampled images. The teacher models are then distilled to train another cascade model that can leverage the entire undersampled k-space during both training and testing. Additionally, we propose an adaptive distillation method that re-weights samples based on the variance among teachers, which represents the confidence of the reconstruction results, to improve the quality of distillation. Experimental results on multiple datasets demonstrate that our method significantly accelerates inference while preserving or even improving performance compared to the teacher model. In our tests, the distilled models show 5%-10% improvements in PSNR and SSIM compared with no distillation and are 10 times faster than the teacher.
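The adaptive distillation idea described above can be illustrated with a minimal sketch: pixels where the teacher ensemble disagrees (high variance, i.e. low confidence) are down-weighted in the distillation loss. This is an illustrative reconstruction, not the paper's actual implementation; the function name, the inverse-variance weighting form, and all parameters are assumptions.

```python
# Hypothetical sketch of variance-weighted (adaptive) distillation.
# Assumption: the distillation target is the teacher-ensemble mean and
# per-pixel weights are inversely proportional to teacher variance.
import numpy as np

def adaptive_distillation_loss(student_out, teacher_outs, eps=1e-8):
    """Variance-weighted MSE between a student and a teacher ensemble.

    student_out:  (H, W) student reconstruction
    teacher_outs: (T, H, W) reconstructions from T teacher models
    """
    target = teacher_outs.mean(axis=0)   # ensemble mean as the soft target
    var = teacher_outs.var(axis=0)       # disagreement = low confidence
    weight = 1.0 / (var + eps)           # confident pixels count more
    weight = weight / weight.mean()      # normalize to keep the loss scale stable
    return float(np.mean(weight * (student_out - target) ** 2))
```

With identical teachers the variance is zero everywhere, so the normalized weights are uniform and the loss reduces to a plain MSE against the ensemble mean.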

Original language: English
Pages (from-to): 3571-3582
Number of pages: 12
Journal: IEEE Journal of Biomedical and Health Informatics
Volume: 28
Issue number: 6
DOIs
Publication status: Published - 1 Jun 2024

Keywords

  • Compressed sensing
  • MRI reconstruction
  • knowledge distillation
  • unsupervised reconstruction
