TY - GEN
T1 - Learning Spectral-wise Correlation for Spectral Super-Resolution
T2 - 31st ACM International Conference on Multimedia, MM 2023
AU - Wang, Hongyuan
AU - Wang, Lizhi
AU - Chen, Chang
AU - Hu, Xue
AU - Song, Fenglong
AU - Huang, Hua
N1 - Publisher Copyright:
© 2023 ACM.
PY - 2023/10/26
Y1 - 2023/10/26
N2 - Hyperspectral images consist of multiple spectral channels, and the task of spectral super-resolution is to reconstruct hyperspectral images from 3-channel RGB images, where modeling spectral-wise correlation is of great importance. Based on the analysis of the physical process of this task, we distinguish the spectral-wise correlation into two aspects: similarity and particularity. The existing Transformer model cannot accurately capture spectral-wise similarity due to the inappropriate spectral-wise fully connected linear mapping acting on input spectral feature maps, which results in the mixing of spectral feature maps. Moreover, the token normalization operation in the existing Transformer model also results in its inability to capture spectral-wise particularity and thus fails to extract key spectral feature maps. To address these issues, we propose a novel Hybrid Spectral-wise Attention Transformer (HySAT). The key module of HySAT is Plausible Spectral-wise self-Attention (PSA), which can simultaneously model spectral-wise similarity and particularity. Specifically, we propose a Token Independent Mapping (TIM) mechanism to reasonably model spectral-wise similarity, where a linear mapping shared by spectral feature maps is applied on input spectral feature maps. Moreover, we propose a Spectral-wise Re-Calibration (SRC) mechanism to model spectral-wise particularity and effectively capture significant spectral feature maps. Experimental results show that our method achieves state-of-the-art performance in the field of spectral super-resolution with the lowest error and computational costs.
AB - Hyperspectral images consist of multiple spectral channels, and the task of spectral super-resolution is to reconstruct hyperspectral images from 3-channel RGB images, where modeling spectral-wise correlation is of great importance. Based on the analysis of the physical process of this task, we distinguish the spectral-wise correlation into two aspects: similarity and particularity. The existing Transformer model cannot accurately capture spectral-wise similarity due to the inappropriate spectral-wise fully connected linear mapping acting on input spectral feature maps, which results in the mixing of spectral feature maps. Moreover, the token normalization operation in the existing Transformer model also results in its inability to capture spectral-wise particularity and thus fails to extract key spectral feature maps. To address these issues, we propose a novel Hybrid Spectral-wise Attention Transformer (HySAT). The key module of HySAT is Plausible Spectral-wise self-Attention (PSA), which can simultaneously model spectral-wise similarity and particularity. Specifically, we propose a Token Independent Mapping (TIM) mechanism to reasonably model spectral-wise similarity, where a linear mapping shared by spectral feature maps is applied on input spectral feature maps. Moreover, we propose a Spectral-wise Re-Calibration (SRC) mechanism to model spectral-wise particularity and effectively capture significant spectral feature maps. Experimental results show that our method achieves state-of-the-art performance in the field of spectral super-resolution with the lowest error and computational costs.
KW - spectral super-resolution
KW - spectral-wise particularity
KW - spectral-wise similarity
KW - transformer
UR - http://www.scopus.com/inward/record.url?scp=85179555680&partnerID=8YFLogxK
U2 - 10.1145/3581783.3611760
DO - 10.1145/3581783.3611760
M3 - Conference contribution
AN - SCOPUS:85179555680
T3 - MM 2023 - Proceedings of the 31st ACM International Conference on Multimedia
SP - 7676
EP - 7685
BT - MM 2023 - Proceedings of the 31st ACM International Conference on Multimedia
PB - Association for Computing Machinery, Inc
Y2 - 29 October 2023 through 3 November 2023
ER -