
A Neural Machine Translation Method Based on Language Model Distillation

The lack of large-scale parallel corpora is one of the key problems facing low-resource neural machine translation. This paper proposes a neural machine translation method based on language model distillation, which regularizes the training of the translation model with a monolingual language model and thereby introduces the prior knowledge contained in that language model to improve translation quality. Specifically, drawing on the idea of knowledge distillation, a target-side language model (teacher model) trained on abundant monolingual data is used to construct a regularization term for the low-resource neural machine translation model (student model), so that the translation model learns the highly generalized prior knowledge captured by the language model. Unlike traditional approaches that fuse a monolingual language model into the decoding process, the language model in this method is used only during training and plays no part in inference, so decoding speed is effectively improved. Experimental results on the Uyghur-Chinese and Tibetan-Chinese low-resource translation datasets from the 17th China Conference on Machine Translation (CCMT 2021) show that, compared with the current state-of-the-art language model fusion method, BLEU is improved by 1.42 points (Tibetan-Chinese) to 2.11 points (Chinese-Uyghur).
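The abstract describes the training objective only at a high level. As a rough illustration, the Python sketch below shows one common way such language-model-distillation regularization can be realized: the usual cross-entropy loss on the reference translation plus a KL term that pulls the student NMT model's target-token distribution toward a frozen target-side teacher language model. The function and variable names (nmt_student, lm_teacher, lambda_kd, T) and the exact loss form are illustrative assumptions, not the paper's own formulation.

# Hedged sketch of LM-distillation regularization for NMT training.
# All names (nmt_student, lm_teacher, lambda_kd, T) are illustrative assumptions.
import torch
import torch.nn.functional as F

def distill_regularized_loss(nmt_student, lm_teacher, src, tgt_in, tgt_out,
                             pad_id, lambda_kd=0.5, T=1.0):
    # Student NMT: target-token logits conditioned on the source sentence.
    student_logits = nmt_student(src, tgt_in)            # (batch, len, vocab)

    # Frozen teacher LM: logits conditioned on the target-side prefix only.
    with torch.no_grad():
        teacher_logits = lm_teacher(tgt_in)              # (batch, len, vocab)

    # Standard translation loss against the reference target tokens.
    ce = F.cross_entropy(student_logits.transpose(1, 2), tgt_out,
                         ignore_index=pad_id)

    # Regularization term: per-token KL divergence from the (temperature-
    # softened) teacher LM distribution to the student distribution.
    kl = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                  F.softmax(teacher_logits / T, dim=-1),
                  reduction="none").sum(-1)               # (batch, len)
    mask = (tgt_out != pad_id).float()
    kl = (kl * mask).sum() / mask.sum()

    # The teacher LM appears only in this training loss; decoding uses the
    # student NMT model alone, so inference speed is unaffected.
    return ce + lambda_kd * kl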

language model; knowledge distillation; regularization; low-resource neural machine translation

Shen Yingli, Zhao Xiaobing


School of Chinese Ethnic Minority Languages and Literatures, Minzu University of China, Beijing 100081, China

National Language Resource Monitoring and Research Center of Minority Languages, Beijing 100081, China

School of Information Engineering, Minzu University of China, Beijing 100081, China


Major Program of the National Social Science Fund of China

22&ZD035

2024

Computer Engineering & Science
College of Computer, National University of Defense Technology


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.787
ISSN: 1007-130X
Year, Volume (Issue): 2024, 46(4)