TY - JOUR
T1 - Knowledge Distillation-Based Lightweight Change Detection in High-Resolution Remote Sensing Imagery for On-Board Processing
AU - Wang, Guoqing
AU - Zhang, Ning
AU - Wang, Jue
AU - Liu, Wenchao
AU - Xie, Yizhuang
AU - Chen, He
N1 - Publisher Copyright:
© 2024 The Authors.
PY - 2024
Y1 - 2024
N2 - Deep learning (DL) has been introduced to change detection (CD) due to its powerful feature representation and robust generalization abilities. However, large DL models incur high computational complexity and massive storage requirements to achieve good performance. For disaster emergency response and other applications with strict timeliness requirements, it is difficult to deploy large DL models on resource-limited spaceborne edge devices for on-board CD processing. To address this limitation, a novel knowledge distillation-based CD (CDKD) method that combines prototypical contrastive (PC) distillation and channel-spatial-normalized (CSN) distillation is proposed. PC distillation represents the feature distribution by calculating the differences between the similarities of pixel features and their positive and negative prototypes, and improves the student model’s ability to detect changed regions whose features are similar to the background by mimicking the relative feature distribution. CSN distillation combines two distillation paradigms, channel normalization and spatial normalization, and guides the student model to comprehensively learn the knowledge contained in the output probabilities of the teacher model so that changed regions with complex shapes are accurately identified. The effectiveness and reliability of the proposed CDKD method are verified on three public remote sensing CD datasets, and extensive experiments and analyses show that CDKD can train lightweight models with performance comparable to that of large models.
AB - Deep learning (DL) has been introduced to change detection (CD) due to its powerful feature representation and robust generalization abilities. However, large DL models incur high computational complexity and massive storage requirements to achieve good performance. For disaster emergency response and other applications with strict timeliness requirements, it is difficult to deploy large DL models on resource-limited spaceborne edge devices for on-board CD processing. To address this limitation, a novel knowledge distillation-based CD (CDKD) method that combines prototypical contrastive (PC) distillation and channel-spatial-normalized (CSN) distillation is proposed. PC distillation represents the feature distribution by calculating the differences between the similarities of pixel features and their positive and negative prototypes, and improves the student model’s ability to detect changed regions whose features are similar to the background by mimicking the relative feature distribution. CSN distillation combines two distillation paradigms, channel normalization and spatial normalization, and guides the student model to comprehensively learn the knowledge contained in the output probabilities of the teacher model so that changed regions with complex shapes are accurately identified. The effectiveness and reliability of the proposed CDKD method are verified on three public remote sensing CD datasets, and extensive experiments and analyses show that CDKD can train lightweight models with performance comparable to that of large models.
KW - Change detection (CD)
KW - feature distribution
KW - knowledge distillation (KD)
KW - model compression and acceleration
KW - probability distribution
UR - http://www.scopus.com/inward/record.url?scp=85182935656&partnerID=8YFLogxK
U2 - 10.1109/JSTARS.2024.3354944
DO - 10.1109/JSTARS.2024.3354944
M3 - Article
AN - SCOPUS:85182935656
SN - 1939-1404
VL - 17
SP - 3860
EP - 3877
JO - IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
JF - IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
ER -