MSGFNet: Multi-Scale Gated Fusion Network for Remote Sensing Image Change Detection

Yukun Wang, Mengmeng Wang*, Zhonghu Hao, Qiang Wang, Qianwen Wang, Yuanxin Ye

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Change detection (CD) stands out as a pivotal yet challenging task in the interpretation of remote sensing images. Significant progress has been made, particularly with the rapid advancement of deep learning techniques. Nevertheless, challenges such as incompletely detected targets and rough boundaries remain, as most CD methods suffer from ineffective feature fusion. Therefore, this paper presents a multi-scale gated fusion network (MSGFNet) to improve the accuracy of CD results. To effectively extract bi-temporal features, a Siamese network built on the EfficientNetB4 model is employed. Subsequently, we propose a multi-scale gated fusion module (MSGFM) that comprises a multi-scale progressive fusion (MSPF) unit and a gated weight adaptive fusion (GWAF) unit, aimed at fusing bi-temporal multi-scale features to preserve boundary details and detect changed targets completely. Finally, we use the simple yet efficient UNet structure to recover the feature maps and predict the results. To demonstrate the effectiveness of the MSGFNet, the LEVIR-CD, WHU-CD, and SYSU-CD datasets were utilized, and the MSGFNet achieved F1 scores of 90.86%, 92.46%, and 80.39% on the three datasets, respectively. Furthermore, its low computational cost and small model size further confirm the superior performance of the MSGFNet.
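
The abstract outlines a pipeline of a shared (Siamese) encoder, gated fusion of the bi-temporal features, and UNet-style recovery of the change map. Below is a minimal PyTorch sketch of that pipeline, not the authors' implementation: the module names, channel widths, dilation rates, and the tiny stand-in encoder are illustrative assumptions (the paper uses EfficientNetB4 as the backbone and a full MSGFM/UNet decoder).

```python
# Illustrative sketch only; all hyperparameters and module names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedWeightedFusion(nn.Module):
    """Fuse bi-temporal features with a learned per-pixel gate (GWAF-style idea)."""
    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.Sigmoid(),
        )

    def forward(self, f1, f2):
        g = self.gate(torch.cat([f1, f2], dim=1))  # gate values in [0, 1]
        return g * f1 + (1.0 - g) * f2


class MultiScaleFusion(nn.Module):
    """Aggregate context at several dilation rates (simplified MSPF-style unit)."""
    def __init__(self, channels, rates=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=r, dilation=r)
            for r in rates
        )
        self.project = nn.Conv2d(len(rates) * channels, channels, kernel_size=1)

    def forward(self, x):
        return self.project(torch.cat([b(x) for b in self.branches], dim=1))


class TinySiameseCD(nn.Module):
    """Siamese encoder + gated multi-scale fusion + upsampling head."""
    def __init__(self, in_ch=3, width=32):
        super().__init__()
        # Stand-in encoder; the paper shares an EfficientNetB4 between the two dates.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.fuse = GatedWeightedFusion(width)
        self.msf = MultiScaleFusion(width)
        self.head = nn.Conv2d(width, 1, kernel_size=1)  # binary change logits

    def forward(self, img_t1, img_t2):
        f1, f2 = self.encoder(img_t1), self.encoder(img_t2)  # shared weights
        fused = self.msf(self.fuse(f1, f2))
        logits = self.head(fused)
        # Recover the input resolution, as a UNet-style decoder would.
        return F.interpolate(logits, size=img_t1.shape[-2:],
                             mode="bilinear", align_corners=False)


if __name__ == "__main__":
    t1, t2 = torch.randn(1, 3, 256, 256), torch.randn(1, 3, 256, 256)
    print(TinySiameseCD()(t1, t2).shape)  # torch.Size([1, 1, 256, 256])
```

The gate lets the network decide, pixel by pixel, how much each date contributes to the fused feature, which is the intuition behind weighting bi-temporal features adaptively rather than simply differencing or concatenating them.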

Original language: English
Article number: 572
Journal: Remote Sensing
Volume: 16
Issue number: 3
DOIs
Publication status: Published - Feb 2024

Keywords

  • change detection
  • gated weight adaptive fusion
  • multi-scale progressive fusion
  • remote sensing images
