FPD: Feature Pyramid Knowledge Distillation

Qi Wang, Lu Liu, Wenxin Yu*, Zhiqiang Zhang, Yuxin Liu, Shiyu Cheng, Xuewen Zhang, Jun Gong

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Knowledge distillation is a commonly used method for model compression. It aims to compress a powerful yet cumbersome model into a lightweight model without much sacrifice of performance, bringing the accuracy of the lightweight model close to that of the cumbersome one. Conventionally, the powerful but bulky model is called the teacher model and the lightweight model is called the student model. To this end, various approaches have been proposed over the past few years. Some classical distillation methods mainly distill deep features from an intermediate layer or the logits layer, and some methods combine knowledge distillation with contrastive learning. However, classical distillation methods leave a significant gap in feature representation between teacher and student, and contrastive-learning distillation methods require massive, diversified data for training. To address these issues, our study aims to narrow the gap in feature representation between teacher and student and to extract richer feature representations from images in limited datasets, achieving better performance. The superiority of our method is validated on a general dataset (CIFAR-100) and a small-scale dataset (CIFAR-10). On CIFAR-100, we achieve top-1 errors of 19.21% and 20.01% with ResNet50 and ResNet18, respectively. Notably, ResNet50 and ResNet18 as student models achieve better performance than the pre-trained ResNet152 and ResNet34 teacher models. On CIFAR-10, we achieve a top-1 error of 4.22% with ResNet18. On both CIFAR-10 and CIFAR-100, our method achieves better performance, and the student model even outperforms the teacher.
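The abstract describes feature-based distillation only at a high level. As a concrete illustration, the sketch below shows how a teacher-student objective combining the classical softened-logits KD loss (Hinton et al.) with level-by-level matching of intermediate feature maps could look in PyTorch. The class name `FPDLoss`, the 1x1 adapter convolutions, the channel lists, and the loss weights are all illustrative assumptions, not the paper's actual implementation.

```python
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch of a feature-pyramid distillation objective:
# student feature maps are projected to the teacher's channel widths
# and matched level by level, on top of the classical KD loss on
# softened logits. Not the paper's actual implementation.

class FPDLoss(nn.Module):
    def __init__(self, student_channels, teacher_channels,
                 temperature=4.0, alpha=0.9, beta=1.0):
        super().__init__()
        self.t = temperature  # softening temperature for the logits
        self.alpha = alpha    # weight of the KD (logits) term
        self.beta = beta      # weight of the feature-matching term
        # 1x1 convs align student channel widths to the teacher's
        self.adapters = nn.ModuleList(
            [nn.Conv2d(sc, tc, kernel_size=1)
             for sc, tc in zip(student_channels, teacher_channels)]
        )

    def forward(self, s_logits, t_logits, s_feats, t_feats, labels):
        # Hard-label cross-entropy on the student's own predictions.
        ce = F.cross_entropy(s_logits, labels)
        # Classical KD: KL divergence between softened distributions,
        # scaled by T^2 to keep gradient magnitudes comparable.
        kd = F.kl_div(
            F.log_softmax(s_logits / self.t, dim=1),
            F.softmax(t_logits / self.t, dim=1),
            reduction="batchmean",
        ) * (self.t ** 2)
        # Pyramid term: MSE between adapted student features and
        # detached teacher features at each level; spatial sizes are
        # assumed to match per level (true for same-family ResNets).
        feat = sum(
            F.mse_loss(adapter(sf), tf.detach())
            for adapter, sf, tf in zip(self.adapters, s_feats, t_feats)
        ) / len(self.adapters)
        return (1 - self.alpha) * ce + self.alpha * kd + self.beta * feat


# Example usage with the per-stage channel widths of a ResNet18
# student and a ResNet50 teacher (hypothetical pairing):
criterion = FPDLoss(student_channels=[64, 128, 256, 512],
                    teacher_channels=[256, 512, 1024, 2048])
```

In such a setup the teacher would run in eval mode with gradients disabled, and only the student and the 1x1 adapters would be updated.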

Original language: English
Title of host publication: Neural Information Processing - 29th International Conference, ICONIP 2022, Proceedings
Editors: Mohammad Tanveer, Sonali Agarwal, Seiichi Ozawa, Asif Ekbal, Adam Jatowt
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 100-111
Number of pages: 12
ISBN (Print): 9783031301049
DOI
Publication status: Published - 2023
Externally published: Yes
Event: 29th International Conference on Neural Information Processing, ICONIP 2022 - Virtual, Online
Duration: 22 Nov 2022 - 26 Nov 2022

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13623 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 29th International Conference on Neural Information Processing, ICONIP 2022
Virtual, Online
Period: 22/11/22 - 26/11/22
