Unsupervised Hyperspectral Pansharpening by Ratio Estimation and Residual Attention Network

Jinyan Nie, Qizhi Xu*, Junjun Pan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)

Abstract

Most deep learning-based hyperspectral pansharpening methods use the hyperspectral images (HSIs) as the ground truth. Training samples are usually obtained by blurring and downsampling the panchromatic image and the HSI. However, these blurring and downsampling operations lose much spatial and spectral information. As a result, model parameters trained on such reduced-resolution samples are unsuitable for fusing full-resolution images. To tackle this problem, we propose an unsupervised hyperspectral pansharpening method via ratio estimation (RE) and a residual attention network (RE-RANet). The spatial and spectral information of the fused image is derived from the original panchromatic image and HSI rather than from reduced-resolution images. First, we generate the initial ratio image using the ratio enhancement method. The initial ratio image is fine-tuned by the residual attention network (RANet) to generate a multichannel ratio image. Then, we inject the multichannel ratio image, which contains spatial detail information, into the HSI. Finally, the generated hyperspectral image is constrained by a spatial constraint loss and a spectral constraint loss. Experiments on the EO-1 and Chikusei datasets verify the effectiveness of the proposed method. Compared with other state-of-the-art approaches, our method performs well in both qualitative visual comparisons and quantitative evaluation metrics.
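
The abstract outlines a pipeline (initial ratio estimation, RANet refinement into a multichannel ratio, detail injection, and two unsupervised constraint losses) whose published code is not reproduced here. The following is a minimal PyTorch sketch of that pipeline under stated assumptions: the `ChannelAttention` block, network widths, the band-mean intensity as a panchromatic proxy, bicubic upsampling, average-pool degradation, and L1 losses are all illustrative choices of ours, not the paper's exact design; `RANet`, `fuse`, and `unsupervised_loss` are hypothetical names.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention (hypothetical
    stand-in for the attention block inside RANet)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)

class RANet(nn.Module):
    """Residual attention network: refines the single-channel initial
    ratio image into a per-band (multichannel) ratio image."""
    def __init__(self, bands, width=64):
        super().__init__()
        self.head = nn.Conv2d(1, width, 3, padding=1)
        self.body = nn.Sequential(
            nn.Conv2d(width, width, 3, padding=1),
            nn.ReLU(inplace=True),
            ChannelAttention(width),
        )
        self.tail = nn.Conv2d(width, bands, 3, padding=1)

    def forward(self, ratio0):
        feat = F.relu(self.head(ratio0))
        feat = feat + self.body(feat)   # residual connection
        # Predict a per-band residual around the initial ratio, so the
        # network fine-tunes rather than replaces the physical estimate.
        return ratio0 + self.tail(feat)

def fuse(pan, hsi_lr, net, scale=4, eps=1e-6):
    """Ratio-estimation fusion: upsample the HSI, form the PAN-to-intensity
    ratio, refine it with RANet, and inject it into the HSI."""
    hsi_up = F.interpolate(hsi_lr, scale_factor=scale, mode='bicubic',
                           align_corners=False)
    intensity = hsi_up.mean(dim=1, keepdim=True)  # crude PAN proxy
    ratio0 = pan / (intensity + eps)              # initial ratio image
    ratio = net(ratio0)                           # multichannel ratio
    return hsi_up * ratio                         # detail injection

def unsupervised_loss(fused, pan, hsi_lr, scale=4):
    """Spectral term: the degraded fusion should match the original HSI.
    Spatial term: the band mean of the fusion should match the PAN.
    Both compare against the original full-resolution inputs, so no
    reduced-resolution ground truth is needed."""
    spectral = F.l1_loss(F.avg_pool2d(fused, scale), hsi_lr)
    spatial = F.l1_loss(fused.mean(dim=1, keepdim=True), pan)
    return spectral + spatial
```

One design note on this sketch: predicting a residual around the initial ratio, rather than the ratio itself, lets training start from a physically motivated estimate, and the band-mean intensity proxy is only a reasonable assumption when the PAN spectral response roughly covers the HSI band range.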

Original language: English
Journal: IEEE Geoscience and Remote Sensing Letters
Volume: 19
DOIs
Publication status: Published - 2022

Keywords

  • Deep learning
  • hyperspectral pansharpening
  • ratio estimation (RE)
  • residual attention network (RANet)
