Multimodal Uncertainty Robust Tree Cover Segmentation for High-Resolution Remote Sensing Images

  • Yuanyuan Gui
  • Wei Li*
  • Yinjian Wang
  • Xiang-Gen Xia
  • Mauro Marty
  • Christian Ginzler
  • Zuyuan Wang*
  • *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Recent advances in semantic segmentation of multimodal remote sensing images have significantly improved the accuracy of tree cover mapping, supporting applications in urban planning, forest monitoring, and ecological assessment. Integrating data from multiple modalities—such as optical imagery, light detection and ranging (LiDAR), and synthetic aperture radar (SAR)—has shown superior performance over single-modality methods. However, these data are often acquired days or even months apart, during which various changes may occur, such as vegetation disturbances (e.g., logging and wildfires) and variations in imaging quality. Such temporal misalignments introduce cross-modal uncertainty, especially in high-resolution imagery, which can severely degrade segmentation accuracy. To address this challenge, we propose MURTreeFormer, a novel multimodal segmentation framework that mitigates and leverages aleatoric uncertainty for robust tree cover mapping. MURTreeFormer treats one modality as primary and others as auxiliary, explicitly modeling patch-level uncertainty in the auxiliary modalities via a probabilistic latent representation. Uncertain patches are identified and reconstructed from the primary modality’s distribution through a VAE-based resampling mechanism, producing enhanced auxiliary features for fusion. In the decoder, a gradient magnitude attention (GMA) module and a lightweight refinement head (RH) are further integrated to guide attention toward tree-like structures and to preserve fine-grained spatial details. Extensive experiments on multimodal datasets from Shanghai and Zurich demonstrate that MURTreeFormer significantly improves segmentation performance and effectively reduces the impact of temporally induced aleatoric uncertainty.
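The abstract's core mechanism — flagging high-uncertainty auxiliary patches and resampling them from the primary modality's latent distribution — can be illustrated with a minimal numpy sketch. This is an illustrative simplification, not the paper's implementation: the function names (`reparameterize`, `resample_uncertain_patches`), the variance threshold `tau`, and the per-patch mean/log-variance representation are assumptions standing in for the learned VAE encoder described in the abstract.

```python
import numpy as np

def reparameterize(mu, logvar, rng):
    # VAE reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def resample_uncertain_patches(aux_mu, aux_logvar, prim_mu, prim_logvar, tau, rng):
    """Replace auxiliary-modality patches whose mean latent variance exceeds
    tau with samples drawn from the primary modality's distribution."""
    var = np.exp(aux_logvar).mean(axis=-1)            # (num_patches,) mean variance
    uncertain = var > tau                             # boolean mask of unreliable patches
    z = reparameterize(aux_mu, aux_logvar, rng)       # default: sample from auxiliary
    z_prim = reparameterize(prim_mu, prim_logvar, rng)
    z[uncertain] = z_prim[uncertain]                  # substitute uncertain patches
    return z, uncertain

# Toy example: 6 patches with 4-dim latents; patches 2 and 5 have high variance.
rng = np.random.default_rng(0)
num_patches, d = 6, 4
aux_mu = rng.standard_normal((num_patches, d))
aux_logvar = np.full((num_patches, d), -2.0)          # low variance: exp(-2) ~ 0.14
aux_logvar[[2, 5]] = 1.0                              # high variance: exp(1) ~ 2.72
prim_mu = rng.standard_normal((num_patches, d))
prim_logvar = np.full((num_patches, d), -2.0)

z, mask = resample_uncertain_patches(aux_mu, aux_logvar,
                                     prim_mu, prim_logvar, tau=1.0, rng=rng)
print(mask)  # patches 2 and 5 flagged as uncertain and resampled
```

In the actual framework the latent distributions would come from learned encoders and the fused features would feed the transformer decoder; the sketch only shows the identify-and-resample step.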

Original language: English
Pages (from-to): 114-128
Number of pages: 15
Journal: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Volume: 19
DOIs
Publication status: Published - 2026
Externally published: Yes

Keywords

  • Multimodal
  • semantic segmentation
  • tree cover mapping
  • uncertainty noise
