Continual Domain Adaption for Neural Machine Translation

Manzhi Yang, Huaping Zhang, Chenxi Yu, Guotong Geng*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Domain Neural Machine Translation (NMT) with small datasets requires continual learning to incorporate new knowledge; the main challenge is catastrophic forgetting, in which the model loses old knowledge during fine-tuning. Additionally, most studies ignore the multi-stage domain adaptation of NMT. To address these issues, we propose a multi-stage incremental framework for domain NMT based on knowledge distillation. We also analyze how the supervised signals of the golden label and the teacher model interact within a stage. Results show that the teacher model benefits the student model only in the early epochs and harms it in the later epochs. To solve this problem, we propose two training objectives for the early and later phases of training. In the early epochs, conventional continual learning is retained to fully leverage the teacher model and integrate old knowledge. In the later epochs, a bidirectional marginal loss is used to remove the negative impact of the teacher model. Experiments show that our method outperforms multiple continual learning methods, with average improvements of 1.11 and 1.06 on two domain translation tasks.
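As a rough illustration of the two-phase objective described above, the PyTorch sketch below combines golden-label cross-entropy with a distillation term in the early epochs, then replaces the distillation term with a margin-style constraint so the teacher no longer acts as a hard target. The names `switch_epoch`, `kd_weight`, `margin`, and `temperature`, and the hinge form of the later-phase loss, are illustrative assumptions; the abstract does not give the exact formulation of the bidirectional marginal loss.

```python
# Hypothetical sketch of the two-phase training objective; hyper-parameter
# names and the margin formulation are assumptions, not the paper's actual method.
import torch
import torch.nn.functional as F

def two_phase_loss(student_logits, teacher_logits, labels, epoch,
                   switch_epoch=5, kd_weight=0.5, margin=0.1, temperature=2.0):
    """student/teacher logits: (batch, seq, vocab); labels: (batch, seq)."""
    vocab = student_logits.size(-1)
    # Golden-label supervision, used in both phases.
    ce = F.cross_entropy(student_logits.view(-1, vocab), labels.view(-1),
                         ignore_index=-100)
    if epoch < switch_epoch:
        # Early phase: conventional continual learning, i.e. cross-entropy
        # plus KL-based knowledge distillation from the frozen teacher.
        kd = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=-1),
            F.softmax(teacher_logits / temperature, dim=-1),
            reduction="batchmean") * temperature ** 2
        return (1 - kd_weight) * ce + kd_weight * kd
    # Later phase: the teacher becomes a loose anchor. The student is only
    # penalized when it drifts more than `margin` from the teacher, so the
    # golden label dominates. This hinge is a stand-in for the paper's
    # bidirectional marginal loss, whose exact form the abstract omits.
    gap = F.kl_div(F.log_softmax(student_logits, dim=-1),
                   F.softmax(teacher_logits, dim=-1),
                   reduction="batchmean")
    return ce + torch.clamp(gap - margin, min=0.0)
```

In use, a training loop would call `two_phase_loss(...)` once per batch, passing the current epoch; the loss switches form automatically at `switch_epoch`, mirroring the early/later split described in the abstract.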

Original language: English
Title of host publication: Neural Information Processing - 30th International Conference, ICONIP 2023, Proceedings
Editors: Biao Luo, Long Cheng, Zheng-Guang Wu, Hongyi Li, Chaojie Li
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 427-439
Number of pages: 13
ISBN (Print): 9789819981441
Publication status: Published - 2024
Event: 30th International Conference on Neural Information Processing, ICONIP 2023 - Changsha, China
Duration: 20 Nov 2023 – 23 Nov 2023

Publication series

Name: Communications in Computer and Information Science
Volume: 1965 CCIS
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 30th International Conference on Neural Information Processing, ICONIP 2023
Country/Territory: China
City: Changsha
Period: 20/11/23 – 23/11/23

Keywords

  • Continual learning
  • Knowledge distillation
  • Neural machine translation
