Abstract
Neural Machine Translation (NMT) brings promising improvements in translation quality, but until recently, these models relied on large-scale parallel corpora. Because such corpora exist for only a handful of language pairs, translation performance falls far short of expectations for the majority of low-resource languages. Developing low-resource translation techniques is therefore crucial, and it has become a popular research field in neural machine translation. In this article, we give a comprehensive review of existing deep learning techniques in low-resource NMT. We first describe the research status as well as some widely used low-resource datasets. Then, we categorize the existing methods and describe representative works in detail. Finally, we summarize their common characteristics and outline future directions in this field.
| Original language | English |
| --- | --- |
| Article number | 103 |
| Journal | ACM Transactions on Asian and Low-Resource Language Information Processing |
| Volume | 21 |
| Issue | 5 |
| DOI | |
| Publication status | Published - 15 Nov 2022 |