TY - GEN
T1 - A Knowledge Enhanced Chinese GaoKao Reading Comprehension Method
AU - Zhang, Xiao
AU - Zheng, Heqi
AU - Huang, Heyan
AU - Chi, Zewen
AU - Mao, Xian Ling
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Chinese GaoKao Reading Comprehension is a challenging NLP task. It requires strong logical reasoning ability to capture deep semantic relations between questions and answers. However, most traditional models cannot learn sufficient inference ability because of the scarcity of Chinese GaoKao reading comprehension data. Intuitively, there are two ways to improve reading comprehension ability on the Chinese GaoKao reading comprehension task: (1) increase the scale of the data; (2) introduce additional related knowledge. In this paper, we propose a novel method based on adversarial training and knowledge distillation, which can be trained on other knowledge-rich datasets and transferred to the Chinese GaoKao reading comprehension task. Extensive experiments show that our proposed model outperforms the state-of-the-art baselines. The code and the relevant dataset will be publicly available.
AB - Chinese GaoKao Reading Comprehension is a challenging NLP task. It requires strong logical reasoning ability to capture deep semantic relations between questions and answers. However, most traditional models cannot learn sufficient inference ability because of the scarcity of Chinese GaoKao reading comprehension data. Intuitively, there are two ways to improve reading comprehension ability on the Chinese GaoKao reading comprehension task: (1) increase the scale of the data; (2) introduce additional related knowledge. In this paper, we propose a novel method based on adversarial training and knowledge distillation, which can be trained on other knowledge-rich datasets and transferred to the Chinese GaoKao reading comprehension task. Extensive experiments show that our proposed model outperforms the state-of-the-art baselines. The code and the relevant dataset will be publicly available.
KW - Adversarial training
KW - Chinese reading comprehension
KW - Knowledge distillation
UR - http://www.scopus.com/inward/record.url?scp=85125066829&partnerID=8YFLogxK
U2 - 10.1109/ICKG52313.2021.00053
DO - 10.1109/ICKG52313.2021.00053
M3 - Conference contribution
AN - SCOPUS:85125066829
T3 - Proceedings - 12th IEEE International Conference on Big Knowledge, ICBK 2021
SP - 347
EP - 352
BT - Proceedings - 12th IEEE International Conference on Big Knowledge, ICBK 2021
A2 - Gong, Zhiguo
A2 - Li, Xue
A2 - Oguducu, Sule Gunduz
A2 - Chen, Lei
A2 - Manjon, Baltasar Fernandez
A2 - Wu, Xindong
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 12th IEEE International Conference on Big Knowledge, ICBK 2021
Y2 - 7 December 2021 through 8 December 2021
ER -