
Remote Sensing Collaborative Classification Using Multimodal Adaptive Modulation Network

Mengmeng Zhang, Yuyang Zhao, Rongjie Chen, Yunhao Gao*, Zhengmao Li, Wei Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

With the development of remote sensing technology, an increasing number of data sources have become available for land-cover classification, such as hyperspectral images (HSIs), light detection and ranging (LiDAR) data, and synthetic aperture radar (SAR) data. Because each source carries unique information, the collaborative use of multiple remote sensing data has become a key research direction in land-cover classification. However, most existing methods can only handle two types of remote sensing images, limiting the potential for collaboration among more sources. Moreover, traditional feature fusion methods integrate information by simple addition or concatenation, which makes it difficult to fully exploit the complementary characteristics between modalities. As a remedy, we propose a novel multimodal adaptive modulation network (MAMNet) for land-cover classification with multimodal remote sensing data. First, a cross-modal interacting module (CIM) enables information absorption between modalities, enhancing the feature representations while preserving modality-specific information. Second, a modal attention layer (MAL) is designed for multimodal feature fusion, in which softmax attention eliminates redundant information among the three modal features. Finally, an adaptive multimodal margin loss (AMM loss) is proposed to balance the consistency and diversity of multimodal features. It encourages adjustable decision margins between sources, which enables the model to better exploit complementary information between modalities and partially avoids overfitting by defining a more difficult learning target. Experimental results on two benchmark remote sensing datasets demonstrate the effectiveness of the proposed method compared with several state-of-the-art approaches.
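To make the fusion step concrete, the sketch below illustrates softmax-attention fusion over three modal feature vectors in the spirit of the MAL described above. This is a minimal stand-in, not the authors' implementation: the scoring projection `w` is a hypothetical placeholder for learned weights, and the feature dimensions are arbitrary.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def modal_attention_fusion(features, w):
    """Fuse per-modality features with softmax attention.

    features: (M, D) array, one D-dim feature per modality
              (e.g., M = 3 for HSI, LiDAR, and SAR).
    w:        (D,) scoring projection; a stand-in for the
              learned attention parameters of the actual MAL.
    Returns the fused (D,) feature and the (M,) attention weights.
    """
    scores = features @ w          # one relevance score per modality
    alpha = softmax(scores)        # weights sum to 1, down-weighting redundancy
    fused = alpha @ features       # convex combination of modal features
    return fused, alpha

# Toy usage with random features for three modalities.
rng = np.random.default_rng(0)
feats = rng.standard_normal((3, 8))
w = rng.standard_normal(8)
fused, alpha = modal_attention_fusion(feats, w)
```

Because the weights are produced by a softmax, the fused feature is a convex combination of the modal features, so no single modality can be amplified beyond its own scale.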

Original language: English
Article number: 5529512
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 62
Publication status: Published - 2024

Keywords

  • Adaptive multimodal margin loss (AMM loss)
  • feature fusion
  • modal attention
  • multimodal classification
  • multisource remote sensing
