Beijing Institute of Technology Researcher Details Findings in Machine Translation (Learning Domain Specific Sub-layer Latent Variable for Multi-Domain Adaptation Neural Machine Translation)


By a News Reporter-Staff News Editor at Robotics & Machine Learning Daily News - Investigators publish new report on machine translation. According to news reporting originating from Beijing, People’s Republic of China, by NewsRx correspondents, research stated, “Domain adaptation proves to be an effective solution for addressing inadequate translation performance within specific domains.”

The news reporters obtained a quote from the research from Beijing Institute of Technology: “However, the straightforward approach of mixing data from multiple domains to obtain the multi-domain neural machine translation (NMT) model can give rise to the parameter interference between domains problem, resulting in a degradation of overall performance. To address this, we introduce a multi-domain adaptive NMT method aimed at learning domain specific sub-layer latent variable and employ the Gumbel-Softmax reparameterization technique to concurrently train both model parameters and domain specific sub-layer latent variable.”
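The Gumbel-Softmax reparameterization mentioned in the quote makes a discrete choice (here, which sub-layer a domain routes through) differentiable, so the routing decision can be trained jointly with the model weights. The following is a minimal NumPy sketch of that relaxation only; the function name, logits, and the three-sub-layer setup are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Differentiable relaxation of sampling a one-hot choice.

    Adding Gumbel(0, 1) noise to the logits and applying a
    temperature-scaled softmax yields a near-one-hot vector whose
    computation is differentiable with respect to the logits.
    """
    rng = rng or np.random.default_rng(0)
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    u = rng.uniform(1e-10, 1.0, size=logits.shape)
    g = -np.log(-np.log(u))
    y = (logits + g) / tau
    e = np.exp(y - y.max())  # numerically stable softmax
    return e / e.sum()

# Hypothetical example: one domain's learnable logits over 3 candidate
# sub-layers. At a low temperature the sample concentrates on one choice.
domain_logits = np.array([2.0, 0.5, -1.0])
weights = gumbel_softmax(domain_logits, tau=0.5)
```

In a multi-domain NMT model, each domain would hold its own logits, and the resulting `weights` would gate or select sub-layer outputs, letting gradient descent discover domain-specific routing while shared parameters train as usual.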

Keywords: Beijing Institute of Technology, Beijing, People’s Republic of China, Asia, Emerging Technologies, Machine Learning, Machine Translation

2024

Robotics & Machine Learning Daily News


ISSN:
Year, Volume (Issue): 2024 (May 13)