Pan-Sharpening with a CNN-Based Two Stage Ratio Enhancement Method

Huanyu Zhou, Qingjie Liu*, Qizhi Xu, Yunhong Wang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

We propose a hybrid method combining the deep learning technique and the ratio enhancement (RE) method for pansharpening. The intuition behind it is to use a deep network to synthesize a panchromatic (PAN) image for the RE method, reducing spectral distortion while preserving spatial details. The method consists of two stages. First, the CNN synthesizer is optimized to generate the downsampled PAN image, guaranteeing that the network has a good initialization. Second, the CNN is integrated into the RE method and supervised by the ground-truth multi-spectral (MS) image to produce an ideal synthesized PAN for the RE method. We conduct experiments on various datasets and compare against widely used methods to demonstrate the superiority of the proposed method.
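The full paper is not reproduced on this page, so the sketch below only illustrates the two-stage scheme as the abstract describes it: a small CNN is first pre-trained to reproduce the downsampled PAN from the MS input, then fine-tuned through a ratio-enhancement fusion supervised by the reference MS. The network architecture, the L1 losses, the band-wise (Brovey-style) form of the RE fusion, and all tensor shapes are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PanSynthesizer(nn.Module):
    """Small CNN mapping a multi-spectral stack to a single-band
    synthetic PAN image (placeholder architecture)."""
    def __init__(self, ms_bands=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(ms_bands, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, ms):
        return self.net(ms)

def ratio_enhance(ms_up, pan, pan_syn, eps=1e-6):
    """RE fusion (assumed Brovey-style): scale each upsampled MS band
    by the ratio between the real PAN and the synthesized PAN."""
    return ms_up * (pan / (pan_syn + eps))

model = PanSynthesizer(ms_bands=4)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

ms = torch.rand(1, 4, 64, 64)        # low-res MS patch (dummy data)
pan = torch.rand(1, 1, 256, 256)     # full-res PAN patch (dummy data)

# Stage 1: pre-train the synthesizer against the downsampled PAN,
# giving the network a good initialization.
pan_lr = F.interpolate(pan, size=ms.shape[-2:], mode='bilinear',
                       align_corners=False)
loss1 = F.l1_loss(model(ms), pan_lr)
loss1.backward(); opt.step(); opt.zero_grad()

# Stage 2: fine-tune through the RE fusion, supervised by the
# ground-truth MS (here a dummy Wald-protocol reference).
ms_up = F.interpolate(ms, size=pan.shape[-2:], mode='bilinear',
                      align_corners=False)
gt_ms = torch.rand(1, 4, 256, 256)
fused = ratio_enhance(ms_up, pan, model(ms_up))
loss2 = F.l1_loss(fused, gt_ms)
loss2.backward(); opt.step()
```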

Original language: English
Title of host publication: 2020 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 216-219
Number of pages: 4
ISBN (Electronic): 9781728163741
DOIs
Publication status: Published - 26 Sept 2020
Externally published: Yes
Event: 2020 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2020 - Virtual, Waikoloa, United States
Duration: 26 Sept 2020 - 2 Oct 2020

Publication series

Name: International Geoscience and Remote Sensing Symposium (IGARSS)

Conference

Conference: 2020 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2020
Country/Territory: United States
City: Virtual, Waikoloa
Period: 26/09/20 - 2/10/20

Keywords

  • Convolutional Neural Network (CNN)
  • Image fusion
  • deep learning
  • pan-sharpening
