BIT-ACT: An Ancient Chinese Translation System Using Data Augmentation

Li Zeng, Yanzhi Tian, Yingyu Shan, Yuhang Guo*

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

Abstract

This paper describes our translation system for ancient Chinese to modern Chinese and English, developed for the EvaHan 2023 competition, a subtask of the Ancient Language Translation 2023 challenge. During training we applied several data augmentation techniques and incorporated SiKu-RoBERTa into our model architecture. The results indicate that back translation improves the model's performance, whereas double back translation introduces noise that harms it. Fine-tuning on the original dataset helps mitigate this issue.
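The back-translation augmentation mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the reverse model here is a dummy stand-in for a real modern-to-ancient Chinese system, and all names are hypothetical.

```python
def back_translate(target_sentences, reverse_model):
    """Create synthetic (source, target) pairs from monolingual
    target-side text by translating it back into the source language."""
    return [(reverse_model(t), t) for t in target_sentences]

# Dummy reverse model standing in for a trained modern->ancient system.
def dummy_reverse_model(sentence):
    return "<synthetic> " + sentence

# Monolingual modern-Chinese side (English placeholders for readability).
monolingual_modern = ["I study history.", "The river flows east."]

synthetic_pairs = back_translate(monolingual_modern, dummy_reverse_model)

# Training would mix these synthetic pairs with the original parallel
# data; per the abstract, a final fine-tuning pass on the original
# dataset helps wash out noise introduced by back translation.
```

Doubling this step (translating the synthetic source back to the target side again) is what the abstract calls double back translation, which compounds the noise rather than adding useful signal.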

Original language: English
Pages: 43-47
Number of pages: 5
Publication status: Published - 2023
Event: 1st Workshop on Ancient Language Translation, ALT 2023, co-located with 19th Machine Translation Summit, MTS 2023 - Macau, China
Duration: 5 Sept 2023 → …

