Abstract
This paper describes a translation model from Ancient Chinese into Modern Chinese and English, built for the EvaHan 2023 competition, a subtask of the Ancient Language Translation 2023 challenge. During training, we applied several data augmentation techniques and used SikuRoBERTa as part of the model architecture. The results indicate that back translation improves the model's performance, whereas double back translation introduces noise that degrades it; fine-tuning on the original dataset helps mitigate this issue.
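The back-translation augmentation the abstract refers to can be pictured as follows. This is a minimal sketch, not the authors' code: `reverse_translate`, `back_translate`, and the toy data are illustrative assumptions standing in for a trained reverse (Modern-to-Ancient Chinese) model and the real corpora.

```python
# Minimal sketch of back translation as data augmentation.
# Assumes a reverse model (Modern Chinese -> Ancient Chinese) is
# already trained; `reverse_translate` is a hypothetical stand-in.

def reverse_translate(modern_sentence: str) -> str:
    """Placeholder for a trained Modern->Ancient translation model."""
    # A real model would return a (noisy) Ancient Chinese rendering.
    return modern_sentence

def back_translate(monolingual_targets: list[str]) -> list[tuple[str, str]]:
    """Turn target-side monolingual text into synthetic parallel pairs:
    the machine-generated source side is noisy, but the target side
    is genuine human text."""
    return [(reverse_translate(t), t) for t in monolingual_targets]

# Augmented training set = original parallel corpus + synthetic pairs.
original_pairs = [("ancient sentence", "modern sentence")]
augmented = original_pairs + back_translate(["modern-only sentence"])
```

Applying the same procedure a second time to the synthetic pairs ("double back translation") compounds the reverse model's errors, which is consistent with the noise the abstract reports; a final fine-tuning pass on the clean original parallel data then counteracts that noise.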
Original language | English |
---|---|
Pages | 43-47 |
Number of pages | 5 |
Publication status | Published - 2023 |
Event | 1st Workshop on Ancient Language Translation, ALT 2023, co-located with 19th Machine Translation Summit, MTS 2023, Macau, China. Duration: 5 Sept 2023 → … |
Conference
Conference | 1st Workshop on Ancient Language Translation, ALT 2023, co-located with 19th Machine Translation Summit, MTS 2023 |
---|---|
Country/Territory | China |
City | Macau |
Period | 5/09/23 → … |