RESA: Relation Enhanced Self-Attention for Low-Resource Neural Machine Translation

Xing Wu, Shumin Shi*, Heyan Huang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

Transformer-based Neural Machine Translation models have achieved impressive results on many translation tasks. Meanwhile, several studies have shown that explicitly incorporating syntactic information can provide further improvements, especially for low-resource languages. In this paper, we propose RESA, a relation enhanced self-attention mechanism for the Transformer that integrates source-side dependency syntax. Specifically, dependency parsing produces two kinds of information: dependency heads and relation labels. Whereas previous work attends only to dependency-head information, RESA also integrates relation labels in two ways: 1) a hard way, which maps the relation-label sequence to continuous representations and uses a hyperparameter to control the proportion of information mixed in; 2) a gate way, which employs a gating mechanism to mix word information with relation-label information. We evaluate our methods on low-resource Chinese-Tibetan and Chinese-Mongolian translation tasks, and preliminary experimental results show that the proposed model achieves gains of 0.93 and 0.68 BLEU points over the baseline model.
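The abstract describes two strategies for mixing relation-label information into word representations. Below is a minimal, hypothetical sketch of those two strategies; the module name, the hyperparameter alpha, and the gate parameterization are assumptions for illustration and may differ from the paper's exact formulation.

```python
# Hypothetical sketch of the two mixing strategies described in the abstract.
# Names (RelationMix, alpha, gate) are illustrative assumptions, not the
# paper's actual implementation.
import torch
import torch.nn as nn


class RelationMix(nn.Module):
    """Mixes word representations with dependency relation-label embeddings."""

    def __init__(self, d_model, num_relations, mode="gate", alpha=0.5):
        super().__init__()
        # Map relation labels (e.g. nsubj, dobj) to continuous vectors.
        self.rel_emb = nn.Embedding(num_relations, d_model)
        self.mode = mode
        self.alpha = alpha  # hard-way mixing weight (a fixed hyperparameter)
        self.gate = nn.Linear(2 * d_model, d_model)  # gate-way parameters

    def forward(self, word_repr, rel_labels):
        # word_repr: (batch, seq_len, d_model); rel_labels: (batch, seq_len) int ids
        rel_repr = self.rel_emb(rel_labels)
        if self.mode == "hard":
            # Hard way: a hyperparameter controls the information percentage.
            return (1 - self.alpha) * word_repr + self.alpha * rel_repr
        # Gate way: a learned gate decides, per position and dimension, how much
        # relation-label information to mix into the word representation.
        g = torch.sigmoid(self.gate(torch.cat([word_repr, rel_repr], dim=-1)))
        return g * word_repr + (1 - g) * rel_repr
```

The mixed representation would then feed the Transformer encoder's self-attention in place of the plain word embeddings, which is how the sketch assumes the "relation enhanced" attention is realized.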

Original language: English
Title of host publication: 2021 International Conference on Asian Language Processing, IALP 2021
Editors: Deyi Xiong, Ridong Jiang, Yanfeng Lu, Minghui Dong, Haizhou Li
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 159-164
Number of pages: 6
ISBN (Electronic): 9781665483117
DOIs
Publication status: Published - 2021
Event: 2021 International Conference on Asian Language Processing, IALP 2021 - Singapore, Singapore
Duration: 11 Dec 2021 - 13 Dec 2021

Publication series

Name: 2021 International Conference on Asian Language Processing, IALP 2021

Conference

Conference: 2021 International Conference on Asian Language Processing, IALP 2021
Country/Territory: Singapore
City: Singapore
Period: 11/12/21 - 13/12/21

Keywords

  • Dependency Syntax
  • Low-Resource Neural Machine Translation
  • Self-Attention
