A Two-Stage Distillation Method: Teach Model to Answer Questions After Comprehending the Document

Ruiqing Sun, Ping Jian*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Multi-choice Machine Reading Comprehension (MRC) is a challenging Natural Language Processing (NLP) task that requires comprehending the semantics and the logical relationships between entities in a given text. The MRC task has traditionally been treated as a single-stage process of answering questions based on the given text. This single-stage approach often leads the model to concentrate on producing the correct answer while neglecting comprehension of the text itself; as a result, many prevalent models struggle on this task when dealing with longer texts. In this paper, we propose a two-stage knowledge distillation method that teaches the model to better comprehend the document by dividing the MRC task into two separate stages. Our experimental results show that a student model equipped with our method achieves significant improvements, demonstrating the effectiveness of the approach.
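The abstract only sketches the method at a high level, but the general shape of a two-stage distillation objective can be illustrated with a toy example. Everything below is an assumption for illustration, not the paper's actual formulation: the function names are invented, and the specific losses (mean-squared error for aligning the student's document representation with the teacher's in a "comprehension" stage, temperature-scaled KL divergence on answer-choice logits in an "answering" stage) are standard distillation choices, not necessarily the ones used in the paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Softmax with temperature; higher T softens the distribution."""
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def kl_divergence(p, q):
    """KL(p || q) for two discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

def stage1_comprehension_loss(teacher_repr, student_repr):
    # Stage 1 (hypothetical): teach the student to "comprehend" the
    # document by matching the teacher's document representation (MSE).
    t = np.asarray(teacher_repr, dtype=float)
    s = np.asarray(student_repr, dtype=float)
    return float(np.mean((t - s) ** 2))

def stage2_answer_loss(teacher_logits, student_logits, temperature=2.0):
    # Stage 2 (hypothetical): distill the teacher's softened answer
    # distribution over the multiple choices via KL divergence.
    # The T^2 factor keeps gradient magnitudes comparable across T
    # (the usual Hinton-style scaling).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return kl_divergence(p, q) * temperature ** 2
```

In a training loop, the two losses would typically be applied in sequence (stage 1 first, then stage 2) or combined with a weighting coefficient; which schedule the paper actually uses is not stated in this abstract.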

Original language: English
Title of host publication: Proceedings of 2023 International Conference on Asian Language Processing, IALP 2023
Editors: Lei Wang, Yanfeng Lu, Minghui Dong
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 240-245
Number of pages: 6
ISBN (Electronic): 9798350330786
DOIs
Publication status: Published - 2023
Event: 27th International Conference on Asian Language Processing, IALP 2023 - Singapore, Singapore
Duration: 18 Nov 2023 - 20 Nov 2023

Publication series

Name: Proceedings of 2023 International Conference on Asian Language Processing, IALP 2023

Conference

Conference: 27th International Conference on Asian Language Processing, IALP 2023
Country/Territory: Singapore
City: Singapore
Period: 18/11/23 - 20/11/23

Keywords

  • Knowledge Distillation
  • Multi-choice Machine Reading Comprehension
  • Semantic Comprehension
  • Two-stage Distillation

