BIT-ACT: An Ancient Chinese Translation System Using Data Augmentation

Li Zeng, Yanzhi Tian, Yingyu Shan, Yuhang Guo*

*Corresponding author of this work

Research output: Conference contribution › Paper › Peer-reviewed

Abstract

This paper describes a translation model from ancient Chinese to modern Chinese and English for the Evahan 2023 competition, a subtask of the Ancient Language Translation 2023 challenge. During training we applied various data augmentation techniques and used SiKu-RoBERTa as part of the model architecture. The results indicate that back translation improves the model's performance, whereas double back translation introduces noise and harms it. Fine-tuning on the original dataset helps mitigate this issue.
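To make the back-translation augmentation mentioned in the abstract concrete, the sketch below shows one common way such synthetic parallel data is produced. This is not the authors' implementation: the `reverse_translate` callable, the batch size, and the language pairing in the comments are illustrative assumptions only.

```python
from typing import Callable, Iterable, List, Tuple


def back_translate(
    target_sentences: Iterable[str],
    reverse_translate: Callable[[List[str]], List[str]],
    batch_size: int = 32,
) -> List[Tuple[str, str]]:
    """Build synthetic (source, target) pairs for data augmentation.

    `target_sentences` are monolingual sentences in the target language
    (e.g. modern Chinese); `reverse_translate` is assumed to wrap a trained
    target-to-source model (e.g. modern Chinese -> ancient Chinese).
    """
    sentences = list(target_sentences)
    synthetic_pairs: List[Tuple[str, str]] = []
    for start in range(0, len(sentences), batch_size):
        batch = sentences[start:start + batch_size]
        # Machine-generated source sentences paired with genuine target text.
        synthetic_sources = reverse_translate(batch)
        synthetic_pairs.extend(zip(synthetic_sources, batch))
    return synthetic_pairs
```

In this scheme the synthetic pairs are mixed with the genuine parallel data for training. Consistent with the abstract's observation, applying the procedure again to already synthetic text (double back translation) compounds the noise, which a final fine-tuning pass on the original dataset can counteract.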

Original language: English
Pages: 43-47
Number of pages: 5
Publication status: Published - 2023
Event: 1st Workshop on Ancient Language Translation, ALT 2023, co-located with 19th Machine Translation Summit, MTS 2023 - Macau, China
Duration: 5 Sep 2023 → …

Conference

Conference: 1st Workshop on Ancient Language Translation, ALT 2023, co-located with 19th Machine Translation Summit, MTS 2023
Country/Territory: China
City: Macau
Period: 5/09/23 → …
