Low-resource Neural Machine Translation: Methods and Trends

Shumin Shi, Xing Wu, Rihai Su, Heyan Huang

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)

Abstract

Neural Machine Translation (NMT) brings promising improvements in translation quality, but these models rely on large-scale parallel corpora. As such corpora exist for only a handful of language pairs, translation performance falls far short of the desired level for the majority of low-resource languages. Developing low-resource translation techniques is therefore crucial, and it has become a popular research field in neural machine translation. In this article, we present a comprehensive review of existing deep learning techniques for low-resource NMT. We first describe the current research status as well as some widely used low-resource datasets. Then, we categorize the existing methods and discuss representative works in detail. Finally, we summarize their common characteristics and outline future directions in this field.

Original language: English
Article number: 103
Journal: ACM Transactions on Asian and Low-Resource Language Information Processing
Volume: 21
Issue number: 5
DOIs
Publication status: Published - 15 Nov 2022

Keywords

  • Low-resource
  • data augmentation
  • neural machine translation
  • pivot-based methods
  • semi-supervised
  • transfer learning
  • unsupervised
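To make one of the listed techniques concrete: back-translation is a widely used data augmentation method in low-resource NMT, in which a reverse (target-to-source) model translates target-side monolingual text to create synthetic parallel pairs. The sketch below is illustrative only; `reverse_model` is a placeholder for a real trained model, here replaced by a toy function.

```python
def back_translate(monolingual_target, reverse_model):
    """Create synthetic (source, target) pairs from target-side
    monolingual sentences using a target->source model.

    This is a minimal sketch of the back-translation idea;
    `reverse_model` stands in for a trained NMT model.
    """
    synthetic_pairs = []
    for tgt_sentence in monolingual_target:
        # Translate target text back into the source language.
        src_synthetic = reverse_model(tgt_sentence)
        # Pair the synthetic source with the genuine target.
        synthetic_pairs.append((src_synthetic, tgt_sentence))
    return synthetic_pairs


# Toy stand-in for a reverse model: reverses word order.
toy_reverse_model = lambda s: " ".join(reversed(s.split()))

pairs = back_translate(["guten tag welt"], toy_reverse_model)
# pairs == [("welt tag guten", "guten tag welt")]
```

The synthetic pairs are then mixed with the genuine parallel corpus to train the forward (source-to-target) model, which is the standard way back-translation enlarges scarce training data.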
