MODEL COMPRESSION VIA COLLABORATIVE DATA-FREE KNOWLEDGE DISTILLATION FOR EDGE INTELLIGENCE

Zhiwei Hao, Yong Luo, Zhi Wang, Han Hu*, Jianping An

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Citations (Scopus)

Abstract

Model compression without access to the original data for fine-tuning is challenging when deploying large models on resource-constrained edge devices. To address this, we propose a novel data-free model compression framework based on knowledge distillation (KD), where multiple teachers are utilized collaboratively to enable reliable distillation. The framework consists of three components: adversarial data generation, multi-teacher KD, and adaptive output aggregation. Specifically, synthetic data are generated in an adversarial manner to mimic the original data used for model compression. A multi-header module is then developed to simultaneously leverage diverse knowledge from multiple teachers, and the distillation outputs are adaptively aggregated for the final prediction. Experimental results demonstrate that our framework significantly outperforms its data-free counterpart (by 4.48% on MNIST and 2.96% on CIFAR-10). The effectiveness of each component is also verified via a carefully designed ablation study.
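
To make the three components concrete, below is a minimal, hypothetical PyTorch sketch of such a collaborative data-free KD loop. All names (`Generator`, `MultiHeadStudent`, `kd_step`), architectures, dimensions, and loss weightings are illustrative assumptions for exposition, not the authors' implementation; in particular, the paper's multi-header module and aggregation scheme may differ from the simple gated average shown here.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class Generator(nn.Module):
    """Maps random noise to synthetic images that stand in for the unavailable data."""

    def __init__(self, z_dim=100, img_shape=(1, 28, 28)):
        super().__init__()
        self.img_shape = img_shape
        self.net = nn.Sequential(
            nn.Linear(z_dim, 256),
            nn.ReLU(),
            nn.Linear(256, math.prod(img_shape)),
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(z.size(0), *self.img_shape)


class MultiHeadStudent(nn.Module):
    """Shared backbone with one head per teacher and a learned aggregation gate."""

    def __init__(self, num_teachers, num_classes=10, in_dim=784):
        super().__init__()
        self.backbone = nn.Sequential(nn.Flatten(), nn.Linear(in_dim, 128), nn.ReLU())
        self.heads = nn.ModuleList([nn.Linear(128, num_classes) for _ in range(num_teachers)])
        self.gate = nn.Linear(128, num_teachers)  # adaptive aggregation weights

    def forward(self, x):
        h = self.backbone(x)
        logits = torch.stack([head(h) for head in self.heads], dim=1)  # (B, T, C)
        w = F.softmax(self.gate(h), dim=1).unsqueeze(-1)               # (B, T, 1)
        return logits, (w * logits).sum(dim=1)  # per-head logits, aggregated logits


def kd_step(generator, student, teachers, g_opt, s_opt, batch=64, z_dim=100, temp=4.0):
    """One alternating step: adversarial generation, then multi-teacher distillation."""
    kl = lambda s, t: F.kl_div(
        F.log_softmax(s / temp, dim=1), F.softmax(t / temp, dim=1), reduction="batchmean"
    )
    z = torch.randn(batch, z_dim)

    # (1) Adversarial data generation: the generator *maximizes* student-teacher
    # mismatch, synthesizing hard samples where the student still disagrees.
    fake = generator(z)
    _, s_agg = student(fake)
    mismatch = sum(kl(s_agg, t(fake)) for t in teachers) / len(teachers)
    g_opt.zero_grad()
    (-mismatch).backward()
    g_opt.step()

    # (2) Multi-teacher KD + adaptive aggregation: each head distills one teacher,
    # and the gated aggregate matches the teacher ensemble average.
    fake = generator(z).detach()
    head_logits, s_agg = student(fake)
    t_logits = [t(fake).detach() for t in teachers]
    loss = sum(kl(head_logits[:, i], t) for i, t in enumerate(t_logits)) / len(t_logits)
    loss = loss + kl(s_agg, torch.stack(t_logits, dim=1).mean(dim=1))
    s_opt.zero_grad()
    loss.backward()
    s_opt.step()
    return loss.item()
```

The teachers are assumed to be pretrained classifiers frozen beforehand (e.g., `p.requires_grad_(False)` over each teacher's parameters) and never updated; only the generator and student parameters are registered with `g_opt` and `s_opt`, so the adversarial gradient in step (1) reaches the generator through the frozen networks without modifying them.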

Original language: English
Title of host publication: 2021 IEEE International Conference on Multimedia and Expo, ICME 2021
Publisher: IEEE Computer Society
ISBN (Electronic): 9781665438643
DOIs: 10.1109/ICME51207.2021.9428308
Publication status: Published - 2021
Event: 2021 IEEE International Conference on Multimedia and Expo, ICME 2021 - Shenzhen, China
Duration: 5 Jul 2021 - 9 Jul 2021

Publication series

Name: Proceedings - IEEE International Conference on Multimedia and Expo
ISSN (Print): 1945-7871
ISSN (Electronic): 1945-788X

Conference

Conference: 2021 IEEE International Conference on Multimedia and Expo, ICME 2021
Country/Territory: China
City: Shenzhen
Period: 5/07/21 - 9/07/21

Keywords

  • Knowledge distillation
  • attention
  • data-free
  • edge intelligence
  • ensemble

Cite this

Hao, Z., Luo, Y., Wang, Z., Hu, H., & An, J. (2021). MODEL COMPRESSION VIA COLLABORATIVE DATA-FREE KNOWLEDGE DISTILLATION FOR EDGE INTELLIGENCE. In 2021 IEEE International Conference on Multimedia and Expo, ICME 2021 (Proceedings - IEEE International Conference on Multimedia and Expo). IEEE Computer Society. https://doi.org/10.1109/ICME51207.2021.9428308