TY - JOUR
T1 - Degradation-estimated hybrid unfolding transformer network for efficient hyperspectral image reconstruction
AU - Fang, Zhen
AU - Ma, Xu
AU - Arce, Gonzalo R.
N1 - Publisher Copyright:
© 2025 Elsevier Ltd
PY - 2025/12
Y1 - 2025/12
N2 - Deep unfolding (DU) methods constitute a promising class of neural networks that have shown good performance in image reconstruction and denoising for compressive hyperspectral imaging. Existing DU methods, however, have difficulty balancing inference speed and reconstruction accuracy due to their limitations in information transmission capacity, degradation estimation, denoiser structure, and training strategy. This paper proposes a novel DU architecture, dubbed the degradation-estimated hybrid unfolding transformer network (DHUTNet), to improve reconstruction quality and inference speed simultaneously. Firstly, a novel degradation-estimated hybrid unfolding framework (DHUF) is proposed to improve generalization ability and training efficiency. The recovery module in DHUF includes a linear projection branch and a deep recovery branch, and the deep recovery branch estimates a pixel-specific degradation pattern from the projection error at each stage. Secondly, a dense-prior fusion module is designed to connect the stages of DHUF, further improving reconstruction quality with a lower computational burden. Thirdly, a U-shaped hybrid-attention transformer is introduced in the denoising module of DHUTNet to efficiently model the various correlations within hyperspectral images. Finally, a novel progressive inherited training strategy for the larger-version DHUTNet is proposed, which significantly reduces training time and improves reconstruction quality. Experimental results demonstrate that the proposed DHUTNet outperforms existing methods, improving inference speed and training efficiency by up to 5-fold and 21-fold, respectively, while achieving equivalent reconstruction quality.
AB - Deep unfolding (DU) methods constitute a promising class of neural networks that have shown good performance in image reconstruction and denoising for compressive hyperspectral imaging. Existing DU methods, however, have difficulty balancing inference speed and reconstruction accuracy due to their limitations in information transmission capacity, degradation estimation, denoiser structure, and training strategy. This paper proposes a novel DU architecture, dubbed the degradation-estimated hybrid unfolding transformer network (DHUTNet), to improve reconstruction quality and inference speed simultaneously. Firstly, a novel degradation-estimated hybrid unfolding framework (DHUF) is proposed to improve generalization ability and training efficiency. The recovery module in DHUF includes a linear projection branch and a deep recovery branch, and the deep recovery branch estimates a pixel-specific degradation pattern from the projection error at each stage. Secondly, a dense-prior fusion module is designed to connect the stages of DHUF, further improving reconstruction quality with a lower computational burden. Thirdly, a U-shaped hybrid-attention transformer is introduced in the denoising module of DHUTNet to efficiently model the various correlations within hyperspectral images. Finally, a novel progressive inherited training strategy for the larger-version DHUTNet is proposed, which significantly reduces training time and improves reconstruction quality. Experimental results demonstrate that the proposed DHUTNet outperforms existing methods, improving inference speed and training efficiency by up to 5-fold and 21-fold, respectively, while achieving equivalent reconstruction quality.
KW - Computational imaging
KW - Deep unfolding
KW - Hyperspectral imaging
KW - Image reconstruction
KW - Machine learning
UR - https://www.scopus.com/pages/publications/105009278336
U2 - 10.1016/j.optlastec.2025.113338
DO - 10.1016/j.optlastec.2025.113338
M3 - Article
AN - SCOPUS:105009278336
SN - 0030-3992
VL - 192
JO - Optics and Laser Technology
JF - Optics and Laser Technology
M1 - 113338
ER -