TY - JOUR
T1 - NAS-Based CNN Channel Pruning for Remote Sensing Scene Classification
AU - Wei, Xin
AU - Zhang, Ning
AU - Liu, Wenchao
AU - Chen, He
N1 - Publisher Copyright:
© 2004-2012 IEEE.
PY - 2022
Y1 - 2022
N2 - Recently, convolutional neural network (CNN)-based remote sensing scene classification has achieved great success. However, the prohibitively expensive computation and storage requirements of state-of-the-art models have hindered the deployment of CNNs on on-board platforms. In this letter, we propose a differentiable neural architecture search (NAS)-based channel pruning method to automatically prune CNN models. In the proposed method, the importance of each output channel is measured by a trainable score. The scores are optimized by an NAS method to search for a well-performing pruned structure. After the search process, a global score threshold is adopted to derive the pruned model. A cost-awareness loss is proposed for the search process to encourage the floating-point operation (FLOP) compression ratio of the pruned model to converge to a desired value. We apply the proposed method to ResNet-34 and VGG-16 to verify its performance. The NWPU-RESISC-45 and UC Merced Land-Use (UCM) datasets are used for the performance evaluation. A comparison with state-of-the-art pruning methods demonstrates that the proposed method achieves competitive performance with a similar reduction in FLOPs.
AB - Recently, convolutional neural network (CNN)-based remote sensing scene classification has achieved great success. However, the prohibitively expensive computation and storage requirements of state-of-the-art models have hindered the deployment of CNNs on on-board platforms. In this letter, we propose a differentiable neural architecture search (NAS)-based channel pruning method to automatically prune CNN models. In the proposed method, the importance of each output channel is measured by a trainable score. The scores are optimized by an NAS method to search for a well-performing pruned structure. After the search process, a global score threshold is adopted to derive the pruned model. A cost-awareness loss is proposed for the search process to encourage the floating-point operation (FLOP) compression ratio of the pruned model to converge to a desired value. We apply the proposed method to ResNet-34 and VGG-16 to verify its performance. The NWPU-RESISC-45 and UC Merced Land-Use (UCM) datasets are used for the performance evaluation. A comparison with state-of-the-art pruning methods demonstrates that the proposed method achieves competitive performance with a similar reduction in FLOPs.
KW - Channel pruning
KW - convolutional neural network (CNN)
KW - neural architecture search (NAS)
KW - remote sensing scene classification
UR - http://www.scopus.com/inward/record.url?scp=85128258113&partnerID=8YFLogxK
U2 - 10.1109/LGRS.2022.3165841
DO - 10.1109/LGRS.2022.3165841
M3 - Article
AN - SCOPUS:85128258113
SN - 1545-598X
VL - 19
JO - IEEE Geoscience and Remote Sensing Letters
JF - IEEE Geoscience and Remote Sensing Letters
M1 - 6508605
ER -