Sparse Teachers Can Be Dense with Knowledge

Yi Yang, Chen Zhang, Dawei Song*

*Corresponding author of this work

Research output: Contribution to conference › Paper › peer-review

4 Citations (Scopus)

Abstract

Recent advances in distilling pretrained language models have discovered that, besides the expressiveness of knowledge, the student-friendliness should be taken into consideration to realize a truly knowledgeable teacher. Based on a pilot study, we find that over-parameterized teachers can produce expressive yet student-unfriendly knowledge and are thus limited in overall knowledgeableness. To remove the parameters that result in student-unfriendliness, we propose a sparse teacher trick under the guidance of an overall knowledgeable score for each teacher parameter. The knowledgeable score is essentially an interpolation of the expressiveness and student-friendliness scores. The aim is to ensure that the expressive parameters are retained while the student-unfriendly ones are removed. Extensive experiments on the GLUE benchmark show that the proposed sparse teachers can be dense with knowledge and lead to students with compelling performance in comparison with a series of competitive baselines.
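The abstract describes the knowledgeable score as an interpolation of per-parameter expressiveness and student-friendliness scores, which then guides the removal of student-unfriendly teacher parameters. A minimal sketch of that idea is below; the interpolation weight `alpha`, the `keep_ratio`, and both scoring inputs are hypothetical placeholders, since the paper's exact scoring functions are not reproduced here.

```python
import numpy as np

def knowledgeable_scores(expressiveness, friendliness, alpha=0.5):
    """Interpolate two per-parameter scores into one knowledgeable score.

    `alpha` is an assumed trade-off weight between expressiveness and
    student-friendliness; it is not the paper's exact formulation.
    """
    e = np.asarray(expressiveness, dtype=float)
    f = np.asarray(friendliness, dtype=float)
    return alpha * e + (1.0 - alpha) * f

def sparsify_mask(scores, keep_ratio=0.5):
    """Keep the top `keep_ratio` fraction of parameters by score (1 = keep)."""
    k = max(1, int(len(scores) * keep_ratio))
    threshold = np.sort(scores)[::-1][k - 1]
    return (scores >= threshold).astype(int)

# Toy example: parameter 0 is expressive but student-unfriendly,
# so the interpolated score prunes it despite its high expressiveness.
expr = [0.9, 0.8, 0.7, 0.2]
frnd = [0.1, 0.9, 0.6, 0.8]
scores = knowledgeable_scores(expr, frnd, alpha=0.5)
mask = sparsify_mask(scores, keep_ratio=0.5)
```

The point of the interpolation is visible in the toy numbers: a purely expressiveness-based criterion would retain parameter 0, whereas the combined score removes it in favor of parameters that are both expressive and student-friendly.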

Original language: English
Pages: 3904-3915
Number of pages: 12
Publication status: Published - 2022
Event: 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 - Abu Dhabi, United Arab Emirates
Duration: 7 Dec 2022 - 11 Dec 2022

Conference

Conference: 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022
Country/Territory: United Arab Emirates
City: Abu Dhabi
Period: 7/12/22 - 11/12/22

Cite this

Yang, Y., Zhang, C., & Song, D. (2022). Sparse Teachers Can Be Dense with Knowledge. 3904-3915. Paper presented at the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022, Abu Dhabi, United Arab Emirates.