BCKD: Block-Correlation Knowledge Distillation

Qi Wang, Lu Liu, Wenxin Yu*, Shiyu Chen, Jun Gong, Peng Chen

*Corresponding author of this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed

1 Citation (Scopus)

Abstract

In this paper, we propose Block-Correlation Knowledge Distillation (BCKD), a novel and efficient knowledge distillation method that differs from classical approaches: it uses a simple multilayer perceptron (MLP) together with the classifier of the pre-trained teacher to learn the correlations between adjacent blocks of the model. Over the past few years, the performance of some methods has been restricted by the feature-map size or by the lack of samples in small-scale datasets. Our proposed BCKD satisfactorily addresses this problem and achieves superior performance without introducing additional overhead. We validate our method on the CIFAR100 and CIFAR10 datasets, and the experimental results demonstrate its effectiveness and superiority.
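Since only the abstract is available here, the exact formulation of BCKD is not specified. The sketch below shows one plausible reading of the described mechanism: pooled features from adjacent student blocks are concatenated, projected by a small MLP, scored by the frozen classifier of the pre-trained teacher, and matched to the teacher's output logits with a soft KL objective. All names, dimensions, the pooling/concatenation scheme, and the choice of KL loss are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def make_mlp(in_dim: int, out_dim: int, hidden: int = 256) -> nn.Sequential:
    """Simple two-layer MLP projecting concatenated block features
    into the input space of the teacher's classifier (assumed design)."""
    return nn.Sequential(nn.Linear(in_dim, hidden),
                         nn.ReLU(inplace=True),
                         nn.Linear(hidden, out_dim))


def block_correlation_loss(student_feats, projectors, teacher_classifier,
                           teacher_logits, T: float = 4.0) -> torch.Tensor:
    """For each pair of adjacent student blocks: pool and concatenate their
    feature maps, project with an MLP, score with the frozen teacher
    classifier, and match the result to the teacher's logits via soft KL."""
    pooled = [F.adaptive_avg_pool2d(f, 1).flatten(1) for f in student_feats]
    loss = torch.zeros((), device=teacher_logits.device)
    for i, proj in enumerate(projectors):
        pair = torch.cat([pooled[i], pooled[i + 1]], dim=1)  # adjacent blocks
        pair_logits = teacher_classifier(proj(pair))         # frozen teacher head
        loss = loss + F.kl_div(
            F.log_softmax(pair_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
    return loss / len(projectors)


if __name__ == "__main__":
    # Toy shapes standing in for three student blocks on CIFAR-100.
    num_classes, feat_dim = 100, 512
    teacher_classifier = nn.Linear(feat_dim, num_classes)
    for p in teacher_classifier.parameters():
        p.requires_grad_(False)  # the pre-trained teacher head is not updated

    student_feats = [torch.randn(8, 64, 16, 16),
                     torch.randn(8, 128, 8, 8),
                     torch.randn(8, 256, 4, 4)]
    projectors = nn.ModuleList([make_mlp(64 + 128, feat_dim),
                                make_mlp(128 + 256, feat_dim)])
    teacher_logits = torch.randn(8, num_classes)
    print(block_correlation_loss(student_feats, projectors,
                                 teacher_classifier, teacher_logits))
```

Under these assumptions, only the small projector MLPs add trainable parameters during distillation, and they can be discarded at inference time, which would be consistent with the abstract's claim of no additional overhead.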

Original language: English
Title of host publication: 2023 IEEE International Conference on Image Processing, ICIP 2023 - Proceedings
Publisher: IEEE Computer Society
Pages: 3225-3229
Number of pages: 5
ISBN (electronic): 9781728198354
DOI
Publication status: Published - 2023
Externally published: Yes
Event: 30th IEEE International Conference on Image Processing, ICIP 2023 - Kuala Lumpur, Malaysia
Duration: 8 Oct 2023 – 11 Oct 2023

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 30th IEEE International Conference on Image Processing, ICIP 2023
Country/Territory: Malaysia
City: Kuala Lumpur
Period: 8/10/23 – 11/10/23
