RESA: Relation Enhanced Self-Attention for Low-Resource Neural Machine Translation

Xing Wu, Shumin Shi*, Heyan Huang

*Corresponding author of this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed


Abstract

Transformer-based Neural Machine Translation models have achieved impressive results on many translation tasks. Meanwhile, some studies show that explicitly incorporating syntactic information can provide further improvements, especially for low-resource languages. In this paper, we propose RESA, a relation enhanced self-attention mechanism for the Transformer that integrates source-side dependency syntax. More specifically, dependency parsing produces two kinds of information: dependency heads and relation labels. Whereas previous works pay attention only to dependency heads, RESA integrates relation labels as well, using two methods: 1) Hard-way, which uses a hyperparameter to control the information percentage after mapping the relation label sequence to continuous representations; 2) Gate-way, which employs a gate mechanism to mix word information and relation label information. We evaluate our methods on low-resource Chinese-Tibetan and Chinese-Mongol translation tasks, and the preliminary experimental results show that the proposed model achieves gains of 0.93 and 0.68 BLEU points over the baseline model.
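The abstract describes two ways of fusing relation-label information into word representations. Below is a minimal PyTorch sketch of those two fusion strategies as we read them from the abstract; it is not the authors' code, and the names (RelationFusion, alpha, mode) are illustrative assumptions.

```python
# Minimal sketch (assumption, not the authors' implementation) of the two
# fusion strategies described in the abstract: "hard-way" mixing with a
# fixed hyperparameter, and "gate-way" mixing with a learned gate.
import torch
import torch.nn as nn


class RelationFusion(nn.Module):
    """Mix word representations with dependency relation-label embeddings."""

    def __init__(self, d_model, num_relations, mode="gate", alpha=0.5):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, d_model)  # relation labels -> vectors
        self.mode = mode
        self.alpha = alpha                                    # hard-way mixing weight (hyperparameter)
        self.gate = nn.Linear(2 * d_model, d_model)           # gate-way gating layer

    def forward(self, word_repr, rel_ids):
        # word_repr: (batch, seq, d_model); rel_ids: (batch, seq) relation label ids
        rel_repr = self.rel_emb(rel_ids)
        if self.mode == "hard":
            # Hard-way: a fixed hyperparameter controls the information percentage.
            return (1 - self.alpha) * word_repr + self.alpha * rel_repr
        # Gate-way: a learned, per-position gate decides how much relation-label
        # information to mix into the word representation.
        g = torch.sigmoid(self.gate(torch.cat([word_repr, rel_repr], dim=-1)))
        return g * word_repr + (1 - g) * rel_repr
```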

Original language: English
Title of host publication: 2021 International Conference on Asian Language Processing, IALP 2021
Editors: Deyi Xiong, Ridong Jiang, Yanfeng Lu, Minghui Dong, Haizhou Li
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 159-164
Number of pages: 6
ISBN (electronic): 9781665483117
DOI
Publication status: Published - 2021
Event: 2021 International Conference on Asian Language Processing, IALP 2021 - Singapore, Singapore
Duration: 11 Dec 2021 – 13 Dec 2021

Publication series

Name: 2021 International Conference on Asian Language Processing, IALP 2021

Conference

Conference: 2021 International Conference on Asian Language Processing, IALP 2021
Country/Territory: Singapore
City: Singapore
Period: 11/12/21 – 13/12/21
