A Two-Stage Distillation Method: Teach Model to Answer Questions After Comprehending the Document

Ruiqing Sun, Ping Jian*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Multi-choice Machine Reading Comprehension (MRC) is a challenging Natural Language Processing (NLP) task that requires comprehending the semantics and logical relationships between entities in a given text. The MRC task has traditionally been treated as a single stage of answering questions based on the given text. This single-stage approach often leads the network to concentrate on generating the correct answer while neglecting comprehension of the text itself; as a result, many prevalent models struggle on this task when dealing with longer texts. In this paper, we propose a two-stage knowledge distillation method that teaches the model to better comprehend the document by dividing the MRC task into two separate stages. Our experimental results show that the student model, when equipped with our method, achieves significant improvements, demonstrating the effectiveness of our method.
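
The abstract does not spell out the loss functions used in the two stages, so the snippet below is only a minimal sketch of how a two-stage teacher-student distillation setup of this kind is commonly implemented in PyTorch: an assumed first stage that aligns the student's document representations with the teacher's, followed by standard soft-target distillation over the answer choices. The function names, the MSE alignment term, and the temperature/alpha hyperparameters are illustrative assumptions, not the authors' reported method.

```python
# Hypothetical sketch of a two-stage knowledge-distillation objective for multi-choice MRC.
# Stage split below is an assumption for illustration only, not the paper's exact formulation.
import torch
import torch.nn.functional as F

def stage1_comprehension_loss(student_hidden, teacher_hidden):
    """Stage 1 (assumed): align the student's document encoding with the teacher's (MSE)."""
    return F.mse_loss(student_hidden, teacher_hidden)

def stage2_answer_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Stage 2 (assumed): standard soft-target distillation over the answer choices."""
    # Soft-label KL term, scaled by T^2 as in Hinton et al. (2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard-label cross-entropy on the gold answer choice.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```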

Original language: English
Title of host publication: Proceedings of 2023 International Conference on Asian Language Processing, IALP 2023
Editors: Lei Wang, Yanfeng Lu, Minghui Dong
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 240-245
Number of pages: 6
ISBN (Electronic): 9798350330786
DOI
Publication status: Published - 2023
Event: 27th International Conference on Asian Language Processing, IALP 2023 - Singapore, Singapore
Duration: 18 Nov 2023 – 20 Nov 2023

Publication series

Name: Proceedings of 2023 International Conference on Asian Language Processing, IALP 2023

Conference

Conference: 27th International Conference on Asian Language Processing, IALP 2023
Country/Territory: Singapore
City: Singapore
Period: 18/11/23 – 20/11/23
