Abstract
Neural machine translation (NMT) usually employs beam search to expand the search space and obtain more translation candidates. However, increasing the beam size often yields an excess of short translations, resulting in a dramatic decrease in translation quality. In this paper, we address the length bias problem from the perspective of causal inference. Specifically, we regard the model-generated translation score S as a degraded version of the true translation quality corrupted by noise, with translation length as one of the confounders. We apply a Half-Sibling Regression method to remove the length effect on S, yielding a debiased translation score free of length information. The proposed method is model-agnostic and unsupervised, so it can be applied to any NMT model and test dataset. We conduct experiments on three translation tasks with datasets of different scales. Experimental results and further analyses show that our approach achieves performance comparable to the empirical baseline methods.
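As a rough illustration of the debiasing idea described in the abstract, the sketch below applies a Half-Sibling-Regression-style correction to an n-best list of beam-search candidates: the model score is regressed on candidate length and the residual (score minus the length-predicted component) is used for reranking. The function name, the example data, and the choice of a simple linear fit are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def debias_scores(scores, lengths):
    """Illustrative Half-Sibling-Regression-style debiasing (assumption: linear fit).

    Regress the model scores S on candidate lengths with ordinary least
    squares and return the residual S - E[S | length] as the debiased score.
    """
    scores = np.asarray(scores, dtype=float)
    lengths = np.asarray(lengths, dtype=float)
    # Design matrix [length, 1] for a simple linear regression of score on length.
    X = np.stack([lengths, np.ones_like(lengths)], axis=1)
    coef, *_ = np.linalg.lstsq(X, scores, rcond=None)
    predicted = X @ coef          # estimated length effect E[S | length]
    return scores - predicted     # residual = length-debiased score

# Usage: rerank a hypothetical n-best list by the debiased score.
scores = [-2.1, -2.5, -1.8, -3.0]   # hypothetical model log-probability scores
lengths = [12, 15, 8, 20]           # hypothetical candidate lengths
residuals = debias_scores(scores, lengths)
best = int(np.argmax(residuals))
print(f"selected candidate {best} with debiased score {residuals[best]:.3f}")
```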
Original language | English
---|---
Pages | 874-885
Number of pages | 12
Publication status | Published - 2021
Event | 20th Chinese National Conference on Computational Linguistics, CCL 2021 - Hohhot, China. Duration: 13 Aug 2021 → 15 Aug 2021
Conference

Conference | 20th Chinese National Conference on Computational Linguistics, CCL 2021
---|---
Country/Territory | China
City | Hohhot
Period | 13/08/21 → 15/08/21