Reports Outline Machine Translation Study Results from Minzu University of China (Unsupervised Multilingual Machine Translation With Pretrained Cross-lingual Encoders)

Data detailed on Machine Translation have been presented. According to news originating from Beijing, People's Republic of China, by NewsRx correspondents, research stated, “Multilingual Neural Machine Translation (MNMT) has recently made great progress in training models that can translate between multiple languages. However, MNMT faces a significant challenge due to the lack of sufficient parallel corpora for all language pairs.” Financial supporters for this research include the Chinese National Funding of Social Sciences and the National Language Commission Foundation of China. Our news journalists obtained a quote from the research from the Minzu University of China, “Unsupervised machine translation methods, which utilize monolingual data, have emerged as a solution to this challenge. In this paper, we propose a method that leverages cross-lingual encoders, such as XLM-R, in an unsupervised manner (i.e., using monolingual data and bilingual dictionaries) to train an MNMT model. Our method initializes the MNMT model with a pre-trained cross-lingual encoder and employs two levels of alignment to further align the representation space in the MNMT model.”
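The article does not include implementation details, but the dictionary-based alignment idea described in the quote can be sketched in plain Python. In this hypothetical sketch (the data, function names, and embeddings below are illustrative, not from the paper), word-level alignment pulls the representations of bilingual-dictionary pairs together, e.g. by minimizing one minus their cosine similarity:

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def dictionary_alignment_loss(dictionary, src_emb, tgt_emb):
    """Average (1 - cosine similarity) over bilingual dictionary pairs.

    dictionary: list of (source_word, target_word) pairs
    src_emb / tgt_emb: dicts mapping a word to its embedding vector
    Lower values mean the two representation spaces agree better
    on the dictionary entries.
    """
    losses = [
        1.0 - cosine(src_emb[s], tgt_emb[t])
        for s, t in dictionary
        if s in src_emb and t in tgt_emb
    ]
    return sum(losses) / len(losses)

# Toy example with made-up 2-D embeddings:
dictionary = [("cat", "猫"), ("dog", "狗")]
src_emb = {"cat": [1.0, 0.0], "dog": [0.0, 1.0]}
tgt_emb = {"猫": [1.0, 0.0], "狗": [0.6, 0.8]}
print(round(dictionary_alignment_loss(dictionary, src_emb, tgt_emb), 3))  # → 0.1
```

In an actual system this loss would be computed over contextual encoder representations and combined with a second, sentence-level alignment objective; the sketch above only illustrates the word-level idea on static toy vectors.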

Keywords: Beijing, People's Republic of China, Asia, Emerging Technologies, Machine Learning, Machine Translation, Minzu University of China

2024

Robotics & Machine Learning Daily News

ISSN:
Year, Volume (Issue): 2024 (Feb. 8)