Abstract
Remote-sensing (RS) scene classification plays an important role in many practical applications. However, traditional methods based on deep convolutional neural networks (DCNNs) struggle with data-shift problems: novel classes, varied orientations, and large intraclass variations in RS scene images. In this letter, we propose rotation-invariant and discriminative-learning prototypical networks (RDPNs) for RS scene classification. RDPN uses A-ORConv32 basic blocks and attention mechanisms to obtain rotation-invariant and discriminative features. In addition, an adaptive cosine center loss is proposed to constrain the features, mitigating large intraclass variations and penalizing hard samples adaptively. Extensive experiments on publicly available datasets show 1.69%-19.38% higher accuracy than existing methods, verifying that the proposed RDPN handles data-shift problems well in RS scene classification.
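The abstract's adaptive cosine center loss can be illustrated with a minimal sketch: each sample is penalized by its angular distance to the learned center of its own class. The exact formulation is not given in the abstract, so the adaptive weighting below (up-weighting samples that lie far from their center, i.e., hard samples) is a hypothetical illustration of the idea, not the paper's actual loss.

```python
import numpy as np

def cosine_center_loss(features, labels, centers):
    """Hedged sketch of an adaptive cosine center loss.

    Penalizes the angular distance between each feature vector and the
    center of its class; the weighting scheme is an assumption, chosen
    so that harder samples (lower cosine similarity) contribute more.
    """
    # Normalize features and class centers to unit length.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    c = centers / np.linalg.norm(centers, axis=1, keepdims=True)
    # Cosine similarity of each sample to its own class center.
    cos_sim = np.sum(f * c[labels], axis=1)
    per_sample = 1.0 - cos_sim      # 0 when perfectly aligned, up to 2
    weights = per_sample            # adaptive: harder samples weigh more
    return np.mean(weights * per_sample)

# Toy usage: two classes whose features sit near their centers,
# so the loss should be small and nonnegative.
centers = np.array([[1.0, 0.0], [0.0, 1.0]])
features = np.array([[0.9, 0.1], [0.1, 0.8]])
labels = np.array([0, 1])
loss = cosine_center_loss(features, labels, centers)
print(loss)
```

Minimizing such a loss pulls features of the same class toward a shared direction on the unit hypersphere, which is one standard way to reduce intraclass variation.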
Original language | English |
---|---|
Article number | 6507105 |
Journal | IEEE Geoscience and Remote Sensing Letters |
Volume | 19 |
DOIs | |
Publication status | Published - 2022 |
Keywords
- A-ORConv
- attention
- data shift problem
- intraclass variations
- remote sensing (RS) scene classification