Abstract
This paper describes our translation model for Ancient Chinese into Modern Chinese and English, built for the EvaHan 2023 competition, a subtask of the Ancient Language Translation (ALT) 2023 challenge. During training we applied several data augmentation techniques and used SiKu-RoBERTa as part of the model architecture. The results indicate that back translation improves the model's performance, whereas double back translation introduces noise and harms it; fine-tuning on the original dataset afterwards can help mitigate this issue.
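Since the abstract turns on the contrast between single and double back translation, a short sketch may help. This is a minimal illustration, not the authors' system: the Helsinki-NLP OPUS-MT checkpoints, the `back_translate`/`double_back_translate` helpers, and the English→Chinese direction (standing in for Ancient→Modern Chinese) are all assumptions introduced for the example.

```python
# Minimal back-translation sketch; illustrative stand-ins, not the EvaHan
# 2023 system. OPUS-MT English<->Chinese models substitute for the paper's
# Ancient<->Modern Chinese models, which are not given here.
from transformers import pipeline

# Reverse model (target -> source), used to synthesize source sentences.
zh_to_en = pipeline("translation", model="Helsinki-NLP/opus-mt-zh-en")
# Forward model (source -> target), used only for the second round below.
en_to_zh = pipeline("translation", model="Helsinki-NLP/opus-mt-en-zh")


def back_translate(target_sentences: list[str]) -> list[tuple[str, str]]:
    """Turn target-side monolingual text into synthetic (source, target) pairs.

    The reverse model translates each target sentence into the source
    language; pairing that synthetic source with the original target yields
    extra parallel data for training the forward model.
    """
    outputs = zh_to_en(target_sentences)
    return [(o["translation_text"], t) for o, t in zip(outputs, target_sentences)]


def double_back_translate(pairs: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """One reading of 'double back translation': push the synthetic sources
    through the forward model to get new targets, then pair again. Each
    extra pass routes the data through an imperfect model once more."""
    sources = [src for src, _ in pairs]
    new_targets = [o["translation_text"] for o in en_to_zh(sources)]
    return list(zip(sources, new_targets))


if __name__ == "__main__":
    monolingual_zh = ["昔者庄周梦为蝴蝶,栩栩然蝴蝶也。"]
    print(back_translate(monolingual_zh))
```

Under these assumptions the abstract's finding falls out naturally: a single round adds useful synthetic pairs, while a second round layers translation errors on top of translation errors, which is consistent with the reported noise effect.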
Original language | English |
---|---|
Pages | 43-47 |
Number of pages | 5 |
Publication status | Published - 2023 |
Event | 1st Workshop on Ancient Language Translation, ALT 2023, co-located with 19th Machine Translation Summit, MTS 2023 - Macau, China. Duration: 5 Sep 2023 → … |
Conference
Conference | 1st Workshop on Ancient Language Translation, ALT 2023, co-located with 19th Machine Translation Summit, MTS 2023 |
---|---|
Country/Territory | China |
City | Macau |
Period | 5/09/23 → … |