MODEL COMPRESSION VIA COLLABORATIVE DATA-FREE KNOWLEDGE DISTILLATION FOR EDGE INTELLIGENCE

Zhiwei Hao, Yong Luo, Zhi Wang, Han Hu*, Jianping An

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Citations (Scopus)

Abstract

Model compression without the original data for fine-tuning is challenging when deploying large models on resource-constrained edge devices. To this end, we propose a novel data-free model compression framework based on knowledge distillation (KD), where multiple teachers are utilized in a collaborative manner to enable reliable distillation. It mainly consists of three components: adversarial data generation, multi-teacher KD, and adaptive output aggregation. In particular, synthesized data are generated in an adversarial manner to mimic the original data for model compression. A multi-header module is then developed to simultaneously leverage diverse knowledge from multiple teachers, and the distillation outputs are adaptively aggregated for the final prediction. The experimental results demonstrate that our framework outperforms the data-free counterpart significantly (4.48% on MNIST and 2.96% on CIFAR-10). The effectiveness of the different components of our method is also verified via a carefully designed ablation study.
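As a rough illustration of the pipeline outlined in the abstract, the following PyTorch sketch pairs a noise-conditioned generator with a multi-head student distilled from several frozen teachers. All names, network shapes, the softmax-weighted head aggregation, the temperature, and the training loop itself are assumptions made for illustration only, not the authors' actual implementation.

```python
# Minimal sketch of collaborative data-free KD, assuming CIFAR-10-like 32x32x3 inputs.
# Every module definition and hyperparameter here is hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Maps random noise to synthetic images used in place of the original data."""
    def __init__(self, noise_dim=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 128 * 8 * 8),
            nn.Unflatten(1, (128, 8, 8)),
            nn.BatchNorm2d(128), nn.ReLU(),
            nn.Upsample(scale_factor=2), nn.Conv2d(128, 64, 3, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(),
            nn.Upsample(scale_factor=2), nn.Conv2d(64, 3, 3, padding=1),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

class MultiHeadStudent(nn.Module):
    """Shared backbone with one output head per teacher, plus learnable softmax
    weights as one possible realisation of adaptive output aggregation."""
    def __init__(self, num_teachers, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
        )
        self.heads = nn.ModuleList(
            nn.Linear(32 * 4 * 4, num_classes) for _ in range(num_teachers))
        self.agg_weights = nn.Parameter(torch.zeros(num_teachers))

    def forward(self, x):
        feat = self.backbone(x)
        logits_per_head = [h(feat) for h in self.heads]   # one head per teacher
        w = torch.softmax(self.agg_weights, dim=0)        # adaptive aggregation weights
        fused = sum(wi * li for wi, li in zip(w, logits_per_head))
        return logits_per_head, fused

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Standard soft-label distillation loss."""
    return F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * T * T

def train_step(generator, student, teachers, g_opt, s_opt, batch=64, noise_dim=100):
    """One adversarial round: the generator seeks samples on which the student and
    the teachers disagree; the student then distils the teachers on those samples."""
    z = torch.randn(batch, noise_dim)

    # 1) Generator step: maximise teacher-student discrepancy (adversarial data generation).
    fake = generator(z)
    heads, _ = student(fake)
    g_loss = -sum(kd_loss(h, t(fake).detach()) for h, t in zip(heads, teachers))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

    # 2) Student step: minimise per-head distillation loss on the same synthetic data.
    fake = generator(z).detach()
    heads, _ = student(fake)
    s_loss = sum(kd_loss(h, t(fake).detach()) for h, t in zip(heads, teachers))
    s_opt.zero_grad(); s_loss.backward(); s_opt.step()
    return g_loss.item(), s_loss.item()
```

In this sketch the generator is pushed toward samples where student and teachers disagree most, while the student closes that gap head by head; the learned softmax weights then fuse the per-teacher heads into the final prediction.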

Original language: English
Title of host publication: 2021 IEEE International Conference on Multimedia and Expo, ICME 2021
Publisher: IEEE Computer Society
ISBN (Electronic): 9781665438643
DOI
Publication status: Published - 2021
Event: 2021 IEEE International Conference on Multimedia and Expo, ICME 2021 - Shenzhen, China
Duration: 5 Jul 2021 – 9 Jul 2021

Publication series

Name: Proceedings - IEEE International Conference on Multimedia and Expo
ISSN (Print): 1945-7871
ISSN (Electronic): 1945-788X

Conference

Conference: 2021 IEEE International Conference on Multimedia and Expo, ICME 2021
Country/Territory: China
City: Shenzhen
Period: 5/07/21 – 9/07/21
