Lifting the Curse of Capacity Gap in Distilling Language Models

Chen Zhang, Yang Yang, Jiahao Liu, Jingang Wang, Yunsen Xian, Benyou Wang, Dawei Song*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed

6 Citations (Scopus)

Abstract

Pretrained language models (LMs) have shown compelling performance on various downstream tasks, but unfortunately they require a tremendous amount of inference compute. Knowledge distillation finds a path to compress LMs into small ones with a teacher-student paradigm. However, when the capacity gap between the teacher and the student is large, a curse of capacity gap appears, invoking a deficiency in distilling LMs. While a few studies have been carried out to fill the gap, the curse is not yet well tackled. In this paper, we aim at lifting the curse of capacity gap by enlarging the capacity of the student without notably increasing the inference compute. Largely motivated by the sparse activation regime of mixture of experts (MoE), we propose a mixture of minimal experts (MiniMoE), which imposes extra parameters on the student but introduces almost no additional inference compute. Experimental results on GLUE and CoNLL demonstrate that the curse of capacity gap is lifted by the magic of MiniMoE to a large extent. MiniMoE also achieves state-of-the-art performance at small FLOPs compared with a range of competitive baselines. With a compression rate of as much as ∼50×, MiniMoE preserves ∼95% of the teacher's GLUE score.
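To make the sparse-activation idea in the abstract concrete, the sketch below shows a top-1-routed mixture of tiny feed-forward experts in PyTorch: total parameters grow with the number of experts, but each token only pays the compute of the single small expert it is routed to. This is a minimal illustration of the general sparse-MoE mechanism, not the paper's exact MiniMoE design; the class name SparseMiniExpertFFN, the top-1 routing choice, and all sizes are illustrative assumptions.

```python
# Minimal sketch (assumed design, not the paper's exact MiniMoE): a sparsely
# activated mixture of small ("minimal") feed-forward experts. Parameters scale
# with num_experts, while per-token compute stays close to one small FFN
# because only the top-1 expert is evaluated for each token.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMiniExpertFFN(nn.Module):
    def __init__(self, d_model: int, d_expert: int, num_experts: int):
        super().__init__()
        # Router scores every token against each expert.
        self.router = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is a small two-layer FFN ("minimal expert").
        self.experts = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(d_model, d_expert),
                    nn.GELU(),
                    nn.Linear(d_expert, d_model),
                )
                for _ in range(num_experts)
            ]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten tokens for routing.
        tokens = x.reshape(-1, x.size(-1))
        gates = F.softmax(self.router(tokens), dim=-1)  # (tokens, experts)
        top_gate, top_idx = gates.max(dim=-1)           # top-1 routing
        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e
            if mask.any():
                # Only tokens routed to expert e pay its compute.
                out[mask] = top_gate[mask].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape_as(x)


if __name__ == "__main__":
    layer = SparseMiniExpertFFN(d_model=256, d_expert=64, num_experts=4)
    y = layer(torch.randn(2, 8, 256))
    print(y.shape)  # torch.Size([2, 8, 256])
```

With this kind of routing, adding experts raises student capacity roughly linearly in parameter count while inference FLOPs per token stay nearly constant, which is the trade-off the abstract appeals to.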

Original language: English
Title of host publication: Long Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 4535-4553
Number of pages: 19
ISBN (Electronic): 9781959429722
Publication status: Published - 2023
Event: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 - Toronto, Canada
Duration: 9 Jul 2023 → 14 Jul 2023

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
Volume: 1
ISSN (Print): 0736-587X

Conference

Conference: 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023
Country/Territory: Canada
City: Toronto
Period: 9/07/23 → 14/07/23
