Adaptive Knowledge Distillation for High-Quality Unsupervised MRI Reconstruction with Model-Driven Priors

Zhengliang Wu, Xuesong Li*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Magnetic Resonance Imaging (MRI) reconstruction has made significant progress with the introduction of Deep Learning (DL) technology combined with Compressed Sensing (CS). However, most existing methods require large fully sampled training datasets to supervise the training process, which may be unavailable in many applications. Current unsupervised models also show limitations in performance or speed and may face unaligned distributions during testing. This paper proposes an unsupervised method for training competitive reconstruction models that generate high-quality samples in an end-to-end fashion. First, teacher models are trained in a self-supervised manner by filling in re-undersampled images and comparing the results against the undersampled images. The teacher models are then distilled to train a cascade model that can leverage the entire undersampled k-space during both training and testing. Additionally, we propose an adaptive distillation method that re-weights the samples based on the variance across teachers, which represents the confidence of the reconstruction results, to improve the quality of distillation. Experimental results on multiple datasets demonstrate that our method significantly accelerates inference while preserving or even improving performance compared with the teacher model. In our tests, the distilled models show 5%-10% improvements in PSNR and SSIM over models trained without distillation and are 10 times faster than the teacher.
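The abstract does not give the exact re-weighting rule, so the following is only a minimal PyTorch-style sketch of the general idea: per-sample distillation loss is down-weighted where the teacher reconstructions disagree (high variance, i.e., low confidence). The function name `adaptive_distillation_loss` and the `1/(1 + var)` weighting are illustrative assumptions, not the authors' formulation.

```python
import torch

def adaptive_distillation_loss(student_out, teacher_outs):
    """Sketch of variance-weighted distillation (assumed form).

    student_out:  student reconstruction, shape (B, C, H, W)
    teacher_outs: list of reconstructions from independently trained
                  teacher models, each of shape (B, C, H, W)
    """
    teachers = torch.stack(teacher_outs, dim=0)        # (T, B, C, H, W)
    target = teachers.mean(dim=0)                      # ensemble mean as soft target
    # Per-sample variance across teachers: high variance -> low confidence.
    var = teachers.var(dim=0).mean(dim=(1, 2, 3))      # (B,)
    weights = 1.0 / (1.0 + var)                        # hypothetical down-weighting rule
    per_sample = ((student_out - target) ** 2).mean(dim=(1, 2, 3))
    return (weights * per_sample).mean()
```

Under this reading, confidently reconstructed samples dominate the distillation signal, while samples on which the teachers disagree contribute less, which is one plausible way to realize the confidence-based re-weighting the abstract describes.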

Original language: English
Pages (from-to): 3571-3582
Number of pages: 12
Journal: IEEE Journal of Biomedical and Health Informatics
Volume: 28
Issue number: 6
DOI
Publication status: Published - 1 Jun 2024
