HPN-CR: Heterogeneous Parallel Network for SAR-Optical Data Fusion Cloud Removal

Panzhe Gu, Wenchao Liu, Shuyi Feng, Tianyu Wei, Jue Wang*, He Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Synthetic aperture radar (SAR)-optical data fusion cloud removal is a highly promising cloud removal technology that has attracted considerable attention. Most research in this field is based on deep learning and can be divided into two categories: convolutional neural network (CNN)-based and Transformer-based methods. Under extensive cloud coverage, CNN-based methods, with their local spatial awareness, effectively capture local structural information in SAR data, preserving clear contours of cloud-removed land covers. However, they struggle to capture global land cover information in optical images, often resulting in notable color discrepancies between recovered and cloud-free regions. Conversely, Transformer-based methods, with their global modeling capability and inherent low-pass filtering properties, excel at capturing long-range spatial correlations in optical images, thereby maintaining color consistency across cloud-removal outputs. However, they are less effective at capturing the fine structural details in SAR data, which can lead to blurred local contours in the final cloud-removed images. In this context, a novel framework called the heterogeneous parallel network for cloud removal (HPN-CR) is proposed to achieve high-quality cloud removal under extensive cloud coverage. HPN-CR employs the proposed heterogeneous encoder, with its SAR-optical input scheme, to extract and fuse the local structural information of cloudy areas from SAR images with the spectral information of land covers in cloud-free areas from the whole optical image. Specifically, it uses a ResNet with local spatial awareness to extract SAR features, and the proposed Decloudformer, which globally models multiscale spatial correlations, to extract optical features. The output features are fused by the heterogeneous encoder and then reconstructed into cloud-removed images by a pixelshuffle-based decoder. Comprehensive experiments were conducted, and the results demonstrate the effectiveness and superiority of the proposed method.
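The two-branch design described in the abstract (a ResNet-style convolutional branch for SAR, a Transformer branch for optical imagery, feature fusion in a shared encoder, and a pixelshuffle-based decoder) can be sketched in PyTorch. This is a minimal illustrative sketch only: the class names, channel counts, patch size, and layer depths below are assumptions for demonstration, not the authors' actual HPN-CR or Decloudformer configuration.

```python
import torch
import torch.nn as nn

class SARBranch(nn.Module):
    """ResNet-style branch: local spatial awareness for SAR structure."""
    def __init__(self, in_ch=2, dim=64):
        super().__init__()
        # Stride-2 stem so both branches meet at half resolution
        self.stem = nn.Conv2d(in_ch, dim, 3, stride=2, padding=1)
        self.body = nn.Sequential(
            nn.Conv2d(dim, dim, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(dim, dim, 3, padding=1),
        )

    def forward(self, x):
        h = self.stem(x)
        return h + self.body(h)  # residual connection preserves local detail

class OpticalBranch(nn.Module):
    """Transformer branch: global modeling of spatial correlations in optical data."""
    def __init__(self, in_ch=3, dim=64, patch=2, heads=4, depth=2):
        super().__init__()
        self.embed = nn.Conv2d(in_ch, dim, patch, stride=patch)  # patch embedding
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           dim_feedforward=2 * dim,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):
        t = self.embed(x)                                 # (B, D, H/2, W/2)
        b, d, h, w = t.shape
        seq = self.encoder(t.flatten(2).transpose(1, 2))  # global self-attention
        return seq.transpose(1, 2).reshape(b, d, h, w)

class HPNCRSketch(nn.Module):
    """Heterogeneous parallel encoder + pixelshuffle decoder (illustrative)."""
    def __init__(self, dim=64, out_ch=3):
        super().__init__()
        self.sar = SARBranch(dim=dim)
        self.opt = OpticalBranch(dim=dim)
        self.fuse = nn.Conv2d(2 * dim, dim, 1)       # merge the two feature streams
        self.decoder = nn.Sequential(                # pixelshuffle-based decoder
            nn.Conv2d(dim, out_ch * 4, 3, padding=1),
            nn.PixelShuffle(2),                      # back to full resolution
        )

    def forward(self, sar, optical):
        f = torch.cat([self.sar(sar), self.opt(optical)], dim=1)
        return self.decoder(self.fuse(f))

# 2-channel SAR patch (e.g. VV/VH) and 3-channel cloudy optical patch
model = HPNCRSketch()
out = model(torch.randn(1, 2, 64, 64), torch.randn(1, 3, 64, 64))
print(out.shape)  # torch.Size([1, 3, 64, 64])
```

The parallel layout mirrors the abstract's reasoning: the convolutional branch keeps sharp local contours from SAR, the attention branch enforces globally consistent color from the optical input, and fusing them at a shared resolution lets a single pixelshuffle decoder reconstruct the cloud-removed image.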

Original language: English
Article number: 5402115
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 63
Publication status: Published - 2025

Keywords

  • Cloud removal
  • convolutional neural network (CNN)
  • data fusion
  • optical imagery
  • synthetic aperture radar (SAR)-optical
  • Transformer
