BCKD: Block-Correlation Knowledge Distillation

Qi Wang, Lu Liu, Wenxin Yu*, Shiyu Chen, Jun Gong, Peng Chen

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

In this paper, we propose Block-Correlation Knowledge Distillation (BCKD), a novel and efficient knowledge distillation method. Unlike classical methods, BCKD uses a simple multilayer perceptron (MLP) together with the classifier of the pre-trained teacher to train the correlations between adjacent blocks of the model. Over the past few years, the performance of some methods has been restricted by the feature-map size or by the lack of samples in small-scale datasets. The proposed BCKD solves this problem satisfactorily and achieves superior performance without introducing additional overhead. Our method is validated on the CIFAR100 and CIFAR10 datasets, and the experimental results demonstrate its effectiveness and superiority.
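To make the abstract's description concrete, below is a minimal PyTorch-style sketch of one way the block-correlation idea could be realized: a small MLP pools and combines the outputs of two adjacent student blocks, the frozen classifier of the pre-trained teacher maps that embedding to class logits, and a KL-divergence term aligns those logits with the teacher's. The module names (`BlockCorrelationHead`, `bckd_loss`), the pooling, and the loss choice are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of the block-correlation idea as described in the abstract.
# All module names, shapes, pooling choices, and the KL-divergence loss are
# illustrative assumptions, not the paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BlockCorrelationHead(nn.Module):
    """Simple MLP mapping features from a pair of adjacent blocks to the
    input dimension expected by the frozen teacher classifier (assumed)."""

    def __init__(self, dim_a: int, dim_b: int, hidden_dim: int, out_dim: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(dim_a + dim_b, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, feat_a: torch.Tensor, feat_b: torch.Tensor) -> torch.Tensor:
        # Global-average-pool each block's feature map, then model the
        # correlation of the adjacent pair with the MLP.
        a = feat_a.mean(dim=(2, 3))  # (N, C_a, H, W) -> (N, C_a)
        b = feat_b.mean(dim=(2, 3))  # (N, C_b, H, W) -> (N, C_b)
        return self.mlp(torch.cat([a, b], dim=1))


def bckd_loss(block_feats, teacher_logits, heads, teacher_classifier, temperature=4.0):
    """Distillation loss over adjacent block pairs of the student.

    block_feats:        feature maps from consecutive student blocks
    teacher_logits:     logits of the pre-trained teacher for the same batch
    teacher_classifier: the teacher's classifier head, parameters frozen
    """
    loss = torch.zeros((), device=teacher_logits.device)
    for i, head in enumerate(heads):
        pair_embed = head(block_feats[i], block_feats[i + 1])
        # Reuse the frozen teacher classifier to turn the pair embedding into
        # class logits; gradients still flow back into the MLP and student.
        pair_logits = teacher_classifier(pair_embed)
        loss = loss + F.kl_div(
            F.log_softmax(pair_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * (temperature ** 2)
    return loss / len(heads)
```

In this reading, the extra MLP heads and the reused teacher classifier are only needed during training, which is consistent with the abstract's claim that no additional overhead is introduced at inference time.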

Original language: English
Title of host publication: 2023 IEEE International Conference on Image Processing, ICIP 2023 - Proceedings
Publisher: IEEE Computer Society
Pages: 3225-3229
Number of pages: 5
ISBN (Electronic): 9781728198354
DOIs
Publication status: Published - 2023
Externally published: Yes
Event: 30th IEEE International Conference on Image Processing, ICIP 2023 - Kuala Lumpur, Malaysia
Duration: 8 Oct 2023 – 11 Oct 2023

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 30th IEEE International Conference on Image Processing, ICIP 2023
Country/Territory: Malaysia
City: Kuala Lumpur
Period: 8/10/23 – 11/10/23

Keywords

  • Block
  • Classifier
  • Correlation
  • Knowledge Distillation
  • Multilayer Perceptron
