New Machine Translation Study Findings Were Recently Reported by Researchers at University of Science and Technology of China (Datastore Distillation for Nearest Neighbor Machine Translation)
Original article links: NETL, NSTL, IEEE
New findings on machine translation have been presented. According to news reporting from Anhui, People’s Republic of China, by NewsRx journalists, the research stated, “Nearest neighbor machine translation (i.e., kNN-MT) is a promising approach to enhance translation quality by equipping pre-trained neural machine translation (NMT) models with nearest neighbor retrieval. Despite its great success, kNN-MT typically requires ample space to store its token-level datastore, making it less practical on edge devices or in online scenarios.”

Financial support for this research came from the National Natural Science Foundation of China (NSFC).

The news correspondents obtained a quote from the researchers at the University of Science and Technology of China: “In this paper, inspired by the concept of knowledge distillation, we provide a new perspective to ease the storage overhead by datastore distillation, which is formalized as a constrained optimization problem. We further design a novel model-agnostic iterative nearest neighbor merging method for the datastore distillation problem to obtain an effective and efficient solution. Experiments on three benchmark datasets indicate that our approach not only reduces the volume of the datastore by up to 50% without significant performance degradation, but also outperforms other baselines by a large margin at the same compression rate.”
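To make the mechanism behind the quoted findings concrete, the following is a minimal, illustrative Python sketch of the two ideas involved: kNN-MT’s interpolation of an NMT output distribution with a retrieval distribution built over a token-level datastore, and a compression pass that merges nearby datastore entries. The parameters of `knn_interpolate` (k, temperature, interpolation weight `lam`) follow the standard kNN-MT formulation; `merge_nearest_pairs` is a hypothetical greedy stand-in for the paper’s iterative nearest neighbor merging, whose exact constrained-optimization formulation the article does not reproduce.

```python
# Toy sketch of kNN-MT retrieval interpolation and datastore compression.
# The merging rule below is a simplified, greedy illustration, not the
# paper's actual method.
import numpy as np

def knn_interpolate(nmt_probs, query, keys, values, k=4,
                    temperature=10.0, lam=0.5):
    """Blend the NMT distribution with a kNN distribution formed from the
    k datastore entries closest to the decoder hidden state `query`."""
    dists = np.sum((keys - query) ** 2, axis=1)      # squared L2 distances
    nn = np.argsort(dists)[:k]                       # k nearest entries
    weights = np.exp(-dists[nn] / temperature)       # softmax over -dist/T
    weights /= weights.sum()
    knn_probs = np.zeros_like(nmt_probs)
    for idx, w in zip(nn, weights):                  # scatter weight onto the
        knn_probs[values[idx]] += w                  # retrieved target tokens
    return lam * knn_probs + (1.0 - lam) * nmt_probs

def merge_nearest_pairs(keys, values, steps):
    """Greedy compression sketch: repeatedly merge the closest pair of
    entries that share the same target token, replacing the pair's keys
    with their centroid so retrieval targets stay exact."""
    keys, values = keys.copy(), list(values)
    for _ in range(steps):
        best, best_d = None, np.inf
        for i in range(len(keys)):                   # O(n^2) scan: fine for a
            for j in range(i + 1, len(keys)):        # toy datastore only
                if values[i] != values[j]:
                    continue                         # only merge same-token pairs
                d = np.sum((keys[i] - keys[j]) ** 2)
                if d < best_d:
                    best, best_d = (i, j), d
        if best is None:
            break                                    # no mergeable pair left
        i, j = best
        keys[i] = (keys[i] + keys[j]) / 2.0          # centroid replaces the pair
        keys = np.delete(keys, j, axis=0)
        del values[j]
    return keys, values

# Toy usage: an 8-entry datastore over a 5-word vocabulary.
rng = np.random.default_rng(0)
keys = rng.normal(size=(8, 4))
values = [0, 1, 1, 2, 1, 0, 2, 3]
mixed = knn_interpolate(np.full(5, 0.2), keys[0], keys, values)
small_keys, small_values = merge_nearest_pairs(keys, values, steps=4)
print(mixed.round(3), len(values), "->", len(small_values))
```

Merging only entries that map to the same target token is one simple way to shrink the key store without ever changing what a retrieval can return; the paper’s model-agnostic iterative method pursues the same storage-for-quality trade-off, reportedly halving the datastore with no significant loss in translation quality.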
Keywords: Anhui, People’s Republic of China, Asia, Emerging Technologies, Machine Learning, Machine Translation, University of Science and Technology of China