Abstract
Remote-sensing (RS) scene classification plays an important role in many practical applications. However, traditional methods based on deep convolutional neural networks (DCNNs) struggle with data-shift problems: novel classes, varied orientations, and large intraclass variations of RS scene images. In this letter, we propose the rotation-invariant and discriminative-learning prototypical networks (RDPNs) for RS scene classification. RDPN uses A-ORConv32 basic blocks and attention mechanisms to obtain rotation-invariant and discriminative features. In addition, an adaptive cosine center loss is proposed to constrain the features, mitigating the large intraclass variations and adaptively penalizing hard samples. We conduct extensive experiments on publicly available datasets and achieve 1.69%-19.38% higher accuracy than existing methods. The experimental results verify that the proposed RDPN can solve the data-shift problems well in RS scene classification.
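The abstract only names the adaptive cosine center loss without giving its formulation. As a rough illustration of the general idea, the minimal PyTorch sketch below pulls each embedding toward a learnable class center in cosine space and up-weights samples far from their center; the class `CosineCenterLoss`, the `gamma` weighting parameter, and all dimensions are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CosineCenterLoss(nn.Module):
    """Illustrative cosine-similarity center loss with hard-sample weighting.

    NOTE: This is a hypothetical sketch of the general technique, not the
    adaptive cosine center loss defined in the RDPN paper.
    """

    def __init__(self, num_classes: int, feat_dim: int, gamma: float = 2.0):
        super().__init__()
        # One learnable center per scene category.
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.gamma = gamma  # controls how strongly hard samples are penalized

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between each feature and its class center.
        f = F.normalize(features, dim=1)
        c = F.normalize(self.centers[labels], dim=1)
        cos_sim = (f * c).sum(dim=1)              # in [-1, 1]
        dist = 1.0 - cos_sim                      # angular distance to the center
        weight = dist.detach().pow(self.gamma)    # larger weight for harder samples
        return (weight * dist).mean()


if __name__ == "__main__":
    # Typical usage: add this term to the classification loss of the backbone.
    loss_fn = CosineCenterLoss(num_classes=45, feat_dim=128)
    feats = torch.randn(8, 128)
    labels = torch.randint(0, 45, (8,))
    print(loss_fn(feats, labels).item())
```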
Original language | English
---|---
Article number | 6507105
Journal | IEEE Geoscience and Remote Sensing Letters
Volume | 19
DOI |
Publication status | Published - 2022