TY - JOUR
T1 - Unsupervised Pansharpening Method Using Residual Network With Spatial Texture Attention
AU - Xiong, Zhangxi
AU - Liu, Na
AU - Wang, Nan
AU - Sun, Zhiwei
AU - Li, Wei
N1 - Publisher Copyright:
© 1980-2012 IEEE.
PY - 2023
Y1 - 2023
N2 - Recently, deep learning has become one of the most popular tools for pansharpening, and many relevant methods have been investigated and have demonstrated great performance. However, a nonnegligible problem is the absence of ground truth (GT). A common solution is to use degraded images as the training input and the original images as GT. The learned mapping between low resolution (LR) and high resolution (HR) is therefore simulated rather than real, which may cause spectral distortion or insufficient spatial texture enhancement in the fused images. To address this drawback, a novel unsupervised attention pansharpening net (UAP-Net) is proposed. The proposed UAP-Net contains two major components: 1) a deep residual network (DRN) and 2) a spatial texture attention block (STAB). The DRN extracts spectral and spatial features from the LR multispectral (LRMS) and panchromatic (PAN) images and fuses those features to make them more representative. The designed STAB adopts the high-frequency component of the corresponding input PAN image as the weight to enhance the spatial details of the residual block output features. Moreover, a new loss function comprising two spatial losses and two spectral losses is established; the losses are calculated in the spatial and frequency domains, respectively. Experiments on Gaofen-2 and WorldView-2 remote sensing data demonstrate that the proposed UAP-Net can fuse PAN and LRMS images effectively without the help of HR multispectral (HRMS) images. The proposed framework is fully general, can be applied to many multisource remote sensing image fusion models, and achieves optimal performance in terms of both subjective visual effect and quantitative evaluation.
AB - Recently, deep learning has become one of the most popular tools for pansharpening, and many relevant methods have been investigated and have demonstrated great performance. However, a nonnegligible problem is the absence of ground truth (GT). A common solution is to use degraded images as the training input and the original images as GT. The learned mapping between low resolution (LR) and high resolution (HR) is therefore simulated rather than real, which may cause spectral distortion or insufficient spatial texture enhancement in the fused images. To address this drawback, a novel unsupervised attention pansharpening net (UAP-Net) is proposed. The proposed UAP-Net contains two major components: 1) a deep residual network (DRN) and 2) a spatial texture attention block (STAB). The DRN extracts spectral and spatial features from the LR multispectral (LRMS) and panchromatic (PAN) images and fuses those features to make them more representative. The designed STAB adopts the high-frequency component of the corresponding input PAN image as the weight to enhance the spatial details of the residual block output features. Moreover, a new loss function comprising two spatial losses and two spectral losses is established; the losses are calculated in the spatial and frequency domains, respectively. Experiments on Gaofen-2 and WorldView-2 remote sensing data demonstrate that the proposed UAP-Net can fuse PAN and LRMS images effectively without the help of HR multispectral (HRMS) images. The proposed framework is fully general, can be applied to many multisource remote sensing image fusion models, and achieves optimal performance in terms of both subjective visual effect and quantitative evaluation.
KW - Multispectral (MS)
KW - panchromatic (PAN)
KW - pansharpening
KW - spatial loss function
KW - spatial texture attention block (STAB)
KW - spectral loss function
UR - https://www.scopus.com/pages/publications/85153349268
U2 - 10.1109/TGRS.2023.3267056
DO - 10.1109/TGRS.2023.3267056
M3 - Article
AN - SCOPUS:85153349268
SN - 0196-2892
VL - 61
JO - IEEE Transactions on Geoscience and Remote Sensing
JF - IEEE Transactions on Geoscience and Remote Sensing
M1 - 5402112
ER -