RDPN: Tackling the Data Shift Problem in Remote Sensing Scene Classification

Xiang Zhang, Xin Wei, Ning Zhang, Wenchao Liu*, Yizhuang Xie

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Remote sensing (RS) scene classification plays an important role in many practical applications. However, traditional methods based on deep convolutional neural networks (DCNNs) face several difficulties under data shift: novel classes, varied orientations, and large intraclass variations of RS scene images. In this letter, we propose the rotation-invariant and discriminative-learning prototypical network (RDPN) for RS scene classification. RDPN uses A-ORConv32 basic blocks and attention mechanisms to obtain rotation-invariant and discriminative features. In addition, an adaptive cosine center loss is proposed to constrain the features, mitigating the large intraclass variations and adaptively penalizing hard samples. We conduct extensive experiments on publicly available datasets and achieve 1.69%-19.38% higher accuracy than existing methods. The experimental results verify that the proposed RDPN handles the data shift problem well in RS scene classification.
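The abstract does not give the exact formulation of the prototype-based classifier or the adaptive cosine center loss, so the following is only a minimal sketch of the general idea in PyTorch: cosine-similarity logits against class prototypes, plus a center-pulling term whose focusing exponent `gamma` is a hypothetical stand-in for the paper's adaptive hard-sample weighting. The A-ORConv32 blocks and attention modules are not reproduced here.

```python
# Minimal sketch, assuming a PyTorch embedding backbone.
# `scale`, `gamma`, and the 0.1 loss weight are illustrative assumptions,
# not the values or formulation used in the RDPN paper.
import torch
import torch.nn.functional as F


def cosine_logits(embeddings, prototypes, scale=10.0):
    """Cosine similarity between L2-normalized embeddings and class prototypes."""
    emb = F.normalize(embeddings, dim=-1)
    proto = F.normalize(prototypes, dim=-1)
    return scale * emb @ proto.t()  # shape: (batch, num_classes)


def cosine_center_loss(embeddings, labels, centers, gamma=2.0):
    """Pull each embedding toward its class center in cosine space.

    `gamma` up-weights samples far from their center (hard samples);
    it is a hypothetical substitute for the paper's adaptive penalty.
    """
    emb = F.normalize(embeddings, dim=-1)
    ctr = F.normalize(centers, dim=-1)
    cos_sim = (emb * ctr[labels]).sum(dim=-1)   # cosine to own class center
    hardness = (1.0 - cos_sim).clamp(min=0.0)   # larger value = harder sample
    return (hardness ** gamma).mean()


# Usage sketch: combine cross-entropy on the cosine logits with the center term.
# embeddings = backbone(images)                      # (B, D) features
# prototypes = support_class_means                   # (C, D), prototypical-network style
# loss = F.cross_entropy(cosine_logits(embeddings, prototypes), labels) \
#        + 0.1 * cosine_center_loss(embeddings, labels, centers)
```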

Original language: English
Article number: 6507105
Journal: IEEE Geoscience and Remote Sensing Letters
Volume: 19
DOIs
Publication status: Published - 2022

Keywords

  • A-ORConv
  • attention
  • data shift problem
  • intraclass variations
  • remote sensing (RS) scene classification
