PSGAN: A Generative Adversarial Network for Remote Sensing Image Pan-Sharpening

Qingjie Liu, Huanyu Zhou, Qizhi Xu, Xiangyu Liu, Yunhong Wang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

170 Citations (Scopus)

Abstract

This article addresses the problem of remote sensing image pan-sharpening from the perspective of generative adversarial learning. We propose a novel deep neural network-based method named pansharpening GAN (PSGAN). To the best of our knowledge, this is one of the first attempts at producing high-quality pan-sharpened images with generative adversarial networks (GANs). The PSGAN consists of two components: a generative network (i.e., generator) and a discriminative network (i.e., discriminator). The generator is designed to accept panchromatic (PAN) and multispectral (MS) images as inputs and maps them to the desired high-resolution (HR) MS images, and the discriminator implements the adversarial training strategy for generating higher fidelity pan-sharpened images. In this article, we evaluate several architectures and designs, namely, two-stream input, stacking input, batch normalization layer, and attention mechanism to find the optimal solution for pan-sharpening. Extensive experiments on QuickBird, GaoFen-2, and WorldView-2 satellite images demonstrate that the proposed PSGANs not only are effective in generating high-quality HR MS images and superior to state-of-the-art methods but also generalize well to full-scale images.
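To make the two-stream design described in the abstract concrete, below is a minimal PyTorch sketch of a PSGAN-style generator and discriminator. The class names (TwoStreamGenerator, Discriminator), layer widths, kernel sizes, and the bicubic upsampling step are illustrative assumptions, not the exact architecture or training configuration reported in the paper.

```python
# Minimal sketch of a two-stream, PSGAN-style generator and discriminator in PyTorch.
# Layer widths, kernel sizes, and the upsampling choice are illustrative assumptions.
import torch
import torch.nn as nn


class TwoStreamGenerator(nn.Module):
    """Fuses a PAN image and an upsampled MS image into a pan-sharpened MS image."""

    def __init__(self, ms_bands=4, features=32):
        super().__init__()
        # Separate feature extractors for the PAN and MS inputs (two-stream design).
        self.pan_stream = nn.Sequential(
            nn.Conv2d(1, features, 3, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.LeakyReLU(0.2, inplace=True),
        )
        self.ms_stream = nn.Sequential(
            nn.Conv2d(ms_bands, features, 3, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.LeakyReLU(0.2, inplace=True),
        )
        # Fusion network mapping the concatenated features to the HR MS output.
        self.fusion = nn.Sequential(
            nn.Conv2d(2 * features, features, 3, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(features, ms_bands, 3, padding=1),
        )

    def forward(self, pan, ms):
        # Upsample the low-resolution MS image to the PAN resolution before fusion.
        ms_up = nn.functional.interpolate(
            ms, size=pan.shape[-2:], mode="bicubic", align_corners=False
        )
        feats = torch.cat([self.pan_stream(pan), self.ms_stream(ms_up)], dim=1)
        # Residual connection: predict only the detail to add to the upsampled MS image.
        return ms_up + self.fusion(feats)


class Discriminator(nn.Module):
    """Patch-style discriminator scoring whether an HR MS image looks real."""

    def __init__(self, ms_bands=4, features=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(ms_bands, features, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(features, 2 * features, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(2 * features, 1, 4, stride=1, padding=1),  # real/fake score map
        )

    def forward(self, x):
        return self.net(x)


if __name__ == "__main__":
    pan = torch.randn(1, 1, 256, 256)  # high-resolution panchromatic image
    ms = torch.randn(1, 4, 64, 64)     # low-resolution 4-band multispectral image
    G, D = TwoStreamGenerator(), Discriminator()
    fused = G(pan, ms)                 # (1, 4, 256, 256) pan-sharpened output
    print(fused.shape, D(fused).shape)
```

The residual connection in the generator echoes the "residual learning" keyword below: the network only has to predict the high-frequency detail missing from the upsampled MS image, while the discriminator provides the adversarial signal that pushes the fused output toward higher fidelity.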

Original language: English
Pages (from-to): 10227-10242
Number of pages: 16
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 59
Issue number: 12
DOIs
Publication status: Published - 1 Dec 2021

Keywords

  • Convolutional neural network (CNN)
  • deep learning
  • generative adversarial network (GAN)
  • pan-sharpening
  • residual learning
