New Machine Translation Data Have Been Reported by Researchers at Beijing Institute of Technology (Alleviating Repetitive Tokens In Non-autoregressive Machine Translation With Unlikelihood Training)

Investigators publish new report on Machine Translation. According to news originating from Beijing, People’s Republic of China, by NewsRx correspondents, research stated, “In recent years, significant progress has been made in the field of non-autoregressive machine translation. However, the accuracy of non-autoregressive models still lags behind their autoregressive counterparts.” Financial support for this research was provided by the National Natural Science Foundation of China (NSFC).

Our news journalists obtained a quote from the research from the Beijing Institute of Technology: “This discrepancy can be attributed to the abundance of repetitive tokens in the target sequences generated by non-autoregressive models. In this study, we delve into this phenomenon and propose a novel approach to train a non-autoregressive model using unlikelihood loss. We evaluate our method on three widely used benchmark tasks. The experimental results demonstrate that our proposed approach significantly reduces the number of repetitive tokens while improving the overall performance of non-autoregressive machine translation.”

According to the news editors, the research concluded: “Compared to the baseline model ‘MaskPredict’, the average number of repetitions on the IWSLT14 DE→EN validation set is reduced from 0.48 to 0.17, a remarkable 62% decrease.”
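The idea described above can be illustrated with a minimal sketch. Unlikelihood training augments the usual likelihood objective with a term that pushes probability mass *away* from a set of negative candidates; for non-autoregressive translation, a natural choice of negatives is tokens that would duplicate an adjacent position's token. The function names, the toy probability distribution, and the choice of negatives below are illustrative assumptions, not the paper's implementation, which follows the general unlikelihood-loss formulation rather than the authors' exact training code.

```python
import math

def nll_loss(probs, target):
    # Standard likelihood term: -log p(target).
    return -math.log(probs[target])

def unlikelihood_loss(probs, negatives):
    # Unlikelihood term: -sum over negative candidates of log(1 - p(neg)).
    # Minimizing this drives the model's probability on negatives toward 0.
    return -sum(math.log(1.0 - probs[n]) for n in negatives)

def combined_loss(probs, target, negatives, alpha=1.0):
    # Total objective: likelihood + alpha * unlikelihood (alpha is a
    # hypothetical weighting hyperparameter).
    return nll_loss(probs, target) + alpha * unlikelihood_loss(probs, negatives)

def count_repetitions(tokens):
    # Repetition metric in the spirit of the reported numbers: count of
    # positions whose token equals the immediately preceding token.
    return sum(1 for a, b in zip(tokens, tokens[1:]) if a == b)

# Toy example: a 3-token vocabulary distribution at one output position.
probs = [0.7, 0.2, 0.1]
loss = combined_loss(probs, target=0, negatives=[1])  # token 1 repeats a neighbor
reps = count_repetitions(["der", "der", "Hund", "bellt"])  # -> 1
```

Here the negative set contains the token predicted at the neighboring position, so the model is explicitly penalized for assigning it high probability again, which is the mechanism by which the repetition count drops.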

Beijing | People’s Republic of China | Asia | Emerging Technologies | Machine Learning | Machine Translation | Beijing Institute of Technology

2024

Robotics & Machine Learning Daily News

Year, Volume (Issue): 2024 (Feb. 9)