TY - JOUR
T1 - Residual Spatial Attention Kernel Generation Network for Hyperspectral Image Classification with Small Sample Size
AU - Xu, Yanbing
AU - Zhang, Yanmei
AU - Yu, Chengcheng
AU - Ji, Chao
AU - Yue, Tingxuan
AU - Li, Huan
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - With the rapid development of deep learning, convolutional neural networks (CNNs) have been widely used in hyperspectral image classification (HSIC) and have achieved excellent performance. However, CNNs reuse the same kernel weights at different locations, resulting in an insufficient capability to capture diverse spatial interactions. Moreover, CNNs usually require a large number of training samples to optimize their learnable parameters; when training samples are limited, CNN classification performance degrades sharply. To tackle these issues, a novel residual spatial attention kernel generation network (RSAKGN) is proposed for HSIC. First, a spatial attention kernel generation module (SAKGM) is built to extract discriminative semantic features; it dynamically computes attention weights to generate location-specific spatial attention kernels. Then, the SAKGM is combined with the residual learning framework by embedding it into a bottleneck residual block, yielding the residual spatial attention block (RSAB). The RSAKGN is constructed by stacking several RSABs. Experimental results on three public HSI datasets demonstrate that the proposed RSAKGN outperforms several state-of-the-art methods with small sample sizes.
AB - With the rapid development of deep learning, convolutional neural networks (CNNs) have been widely used in hyperspectral image classification (HSIC) and have achieved excellent performance. However, CNNs reuse the same kernel weights at different locations, resulting in an insufficient capability to capture diverse spatial interactions. Moreover, CNNs usually require a large number of training samples to optimize their learnable parameters; when training samples are limited, CNN classification performance degrades sharply. To tackle these issues, a novel residual spatial attention kernel generation network (RSAKGN) is proposed for HSIC. First, a spatial attention kernel generation module (SAKGM) is built to extract discriminative semantic features; it dynamically computes attention weights to generate location-specific spatial attention kernels. Then, the SAKGM is combined with the residual learning framework by embedding it into a bottleneck residual block, yielding the residual spatial attention block (RSAB). The RSAKGN is constructed by stacking several RSABs. Experimental results on three public HSI datasets demonstrate that the proposed RSAKGN outperforms several state-of-the-art methods with small sample sizes.
KW - Attention mechanism
KW - hyperspectral image classification (HSIC)
KW - residual learning framework
KW - small sample size
KW - spatial attention kernel generation module (SAKGM)
UR - http://www.scopus.com/inward/record.url?scp=85130458313&partnerID=8YFLogxK
U2 - 10.1109/TGRS.2022.3175494
DO - 10.1109/TGRS.2022.3175494
M3 - Article
AN - SCOPUS:85130458313
SN - 0196-2892
VL - 60
JO - IEEE Transactions on Geoscience and Remote Sensing
JF - IEEE Transactions on Geoscience and Remote Sensing
M1 - 5529714
ER -