Multiscale Neighborhood Information Fusion Network for Classification of Remote Sensing LiDAR Images

Jiao Dong, Kaiqi Liu*, Jiawei Han, Mengmeng Zhang, Xudong Zhao, Wei Li, Li Xiong, Mengbin Rao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

In the field of remote sensing (RS) data processing, the light detection and ranging (LiDAR)-derived digital surface model (DSM), which effectively reflects elevation information, has significant value in earth observation applications. In this work, we propose a multiscale neighborhood information fusion (MNIF) network for the classification of RS LiDAR images. Specifically, the network simultaneously captures both fine details and overall characteristics of the data through multiscale patches, with the information from different scales complementing one another. Subsequently, a spatial-aware region (SR) attention module is utilized to emphasize distinctive features specific to each category. Furthermore, since the input feature maps from different scales encompass distinct neighborhood information, cross-scale convolutional kernels are incorporated into the designed optimized feature extractor (OFE). Experimental results on two DSM datasets, Houston and Trento, from the 2013 GRSS Data Fusion Competition demonstrate that the proposed MNIF network achieves superior classification performance compared with several existing methods.
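The multiscale-patch idea described in the abstract can be illustrated with a small sketch: for each pixel of a DSM, patches of several sizes are cut out around it so that small patches preserve local detail while large ones capture broader context. The patch sizes, the padding mode, and the function name below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def extract_multiscale_patches(dsm, row, col, scales=(7, 11, 15)):
    """Extract square patches of several sizes centered at (row, col).

    Edge pixels are handled with reflect padding so every patch keeps
    its nominal size. Scale values here are assumptions for the sketch,
    not the configuration used in the MNIF paper.
    """
    pad = max(scales) // 2
    padded = np.pad(dsm, pad, mode="reflect")
    r, c = row + pad, col + pad  # shift into padded coordinates
    patches = []
    for s in scales:
        half = s // 2
        patches.append(padded[r - half:r + half + 1, c - half:c + half + 1])
    return patches

# Usage: three co-centered patches for one pixel of a toy 10x10 DSM.
dsm = np.arange(100, dtype=float).reshape(10, 10)
for p in extract_multiscale_patches(dsm, 3, 4):
    print(p.shape)
```

In a full pipeline, each patch would feed a separate branch of the network, with the branch outputs fused downstream; this sketch covers only the input-preparation step.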

Original language: English
Pages (from-to): 16601-16613
Number of pages: 13
Journal: IEEE Sensors Journal
Volume: 24
Issue number: 10
DOIs
Publication status: Published - 15 May 2024

Keywords

  • Data classification
  • deep learning
  • feature extraction
  • light detection and ranging (LiDAR)
  • multiscale input
