HPN-CR: Heterogeneous Parallel Network for SAR-Optical Data Fusion Cloud Removal

Panzhe Gu, Wenchao Liu, Shuyi Feng, Tianyu Wei, Jue Wang*, He Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

SAR-optical data fusion cloud removal is a highly promising cloud removal technology that has attracted considerable attention. Most research in this field is based on deep learning and can be divided into two categories: convolutional neural network (CNN)-based and Transformer-based. In cases of extensive cloud coverage, CNN-based methods, with their local spatial awareness, effectively capture local structural information in SAR data, preserving clear contours of the cloud-removed land covers. However, these methods struggle to capture global land cover information in optical images, often resulting in notable color discrepancies between recovered and cloud-free regions. Conversely, Transformer-based methods, with their global modeling capability and inherent low-pass filtering properties, excel at capturing long-range spatial correlations in optical images, thereby maintaining color consistency across cloud-removal outputs. However, they are less effective at capturing the fine structural details in SAR data, which can lead to blurred local contours in the final cloud-removed images. In this context, a novel framework called the heterogeneous parallel network for cloud removal (HPN-CR) is proposed to achieve high-quality cloud removal under extensive cloud coverage. HPN-CR employs the proposed heterogeneous encoder, with its SAR-optical input scheme, to extract and fuse the local structural information of cloudy areas from SAR images with the spectral information of land covers in cloud-free areas from the whole optical images. In particular, it uses a ResNet network with local spatial awareness to extract SAR features, and the proposed Decloudformer, which globally models multi-scale spatial correlations, to extract optical features. The output features are fused by the heterogeneous encoder and then reconstructed into cloud-removed images through a PixelShuffle-based decoder. Comprehensive experiments were conducted, and the results demonstrate the effectiveness and superiority of the proposed method. The code is available at https://github.com/G-pz/HPN-CR.
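To make the parallel design concrete, below is a minimal, hypothetical PyTorch sketch of the architecture outlined in the abstract: a ResNet-style CNN branch for SAR, a plain Transformer encoder standing in for Decloudformer on the optical input, a simple fusion step, and a PixelShuffle-based decoder. The channel widths, layer counts, downsampling factor, two-channel SAR input, and the concatenation-plus-1×1-convolution fusion are assumptions for illustration, not the authors' implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Plain residual block for the SAR branch (local spatial awareness)."""
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)


class OpticalTransformerBranch(nn.Module):
    """Stand-in for the paper's Decloudformer: patch embedding followed by a
    plain Transformer encoder, modeling long-range spatial correlations in the
    optical image at 1/4 resolution."""
    def __init__(self, in_ch=3, dim=64, patch=4, depth=2, heads=4):
        super().__init__()
        self.embed = nn.Conv2d(in_ch, dim, kernel_size=patch, stride=patch)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, x):
        tokens = self.embed(x)                          # B, dim, H/4, W/4
        b, c, h, w = tokens.shape
        seq = self.encoder(tokens.flatten(2).transpose(1, 2))
        return seq.transpose(1, 2).reshape(b, c, h, w)


class HPNCRSketch(nn.Module):
    """Heterogeneous parallel layout: CNN branch for SAR, Transformer branch
    for optical, feature fusion, and a PixelShuffle-based decoder."""
    def __init__(self, sar_ch=2, opt_ch=3, dim=64):
        super().__init__()
        # SAR branch: strided convs + residual blocks (local structure).
        self.sar_branch = nn.Sequential(
            nn.Conv2d(sar_ch, dim, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1),
            ResidualBlock(dim), ResidualBlock(dim))
        # Optical branch: Transformer (global spatial correlations, color).
        self.opt_branch = OpticalTransformerBranch(opt_ch, dim)
        # Assumed fusion: channel concatenation + 1x1 convolution.
        self.fuse = nn.Conv2d(2 * dim, dim, 1)
        # PixelShuffle-based decoder: 1/4 -> 1/2 -> full resolution.
        self.decoder = nn.Sequential(
            nn.Conv2d(dim, dim * 4, 3, padding=1), nn.PixelShuffle(2),
            nn.Conv2d(dim, dim * 4, 3, padding=1), nn.PixelShuffle(2),
            nn.Conv2d(dim, opt_ch, 3, padding=1))

    def forward(self, sar, optical):
        f_sar = self.sar_branch(sar)                    # B, dim, H/4, W/4
        f_opt = self.opt_branch(optical)                # B, dim, H/4, W/4
        fused = self.fuse(torch.cat([f_sar, f_opt], dim=1))
        return self.decoder(fused)                      # B, opt_ch, H, W


if __name__ == "__main__":
    sar = torch.randn(1, 2, 128, 128)      # e.g., two SAR polarization channels
    optical = torch.randn(1, 3, 128, 128)  # cloudy optical RGB patch
    print(HPNCRSketch()(sar, optical).shape)  # torch.Size([1, 3, 128, 128])
```

The design choice illustrated here is the division of labor the abstract argues for: the convolutional branch keeps local contours from SAR, the Transformer branch keeps global color consistency from the optical image, and only the fused features are decoded back to a cloud-removed image.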

Original language: English
Journal: IEEE Transactions on Geoscience and Remote Sensing
DOIs: https://doi.org/10.1109/TGRS.2025.3546489
Publication status: Accepted/In press - 2025

Keywords

  • Cloud removal
  • CNN
  • data fusion
  • optical imagery
  • SAR-optical
  • Transformer


Cite this

Gu, P., Liu, W., Feng, S., Wei, T., Wang, J., & Chen, H. (Accepted/In press). HPN-CR: Heterogeneous Parallel Network for SAR-Optical Data Fusion Cloud Removal. IEEE Transactions on Geoscience and Remote Sensing. https://doi.org/10.1109/TGRS.2025.3546489