Thammasat University Researcher Discusses Research in Machine Translation (A Study for Enhancing Low-resource Thai-Myanmar-English Neural Machine Translation)
By a News Reporter-Staff News Editor at Robotics & Machine Learning Daily News. Investigators publish a new report on machine translation. According to news reporting out of Thammasat University by NewsRx editors, the research stated, "Several methodologies have recently been proposed to enhance the performance of low-resource Neural Machine Translation (NMT)."

Our news reporters obtained a quote from the research from Thammasat University: "However, these techniques have yet to be explored thoroughly in low-resource Thai and Myanmar languages. Therefore, we first applied augmentation techniques such as SwitchOut and Ciphertext Based Data Augmentation (CipherDAug) to improve NMT performance in these languages. We secondly enhanced the NMT performance by fine-tuning the pre-trained Multilingual Denoising BART model (mBART), where BART denotes Bidirectional and Auto-Regressive Transformer. We implemented three NMT systems, namely Transformer+SwitchOut, Multi-source Transformer+CipherDAug, and fine-tuned mBART, in the bidirectional translations of Thai-English-Myanmar language pairs from the ASEAN-MT corpus. Experimental results showed that Multi-source Transformer+CipherDAug significantly improved BLEU, ChrF, and TER scores over the first baseline Transformer and the second baseline Edit-Based Transformer (EDITOR). The model achieved notable BLEU scores: 37.9 (English-to-Thai), 42.7 (Thai-to-English), 28.9 (English-to-Myanmar), 31.2 (Myanmar-to-English), 25.3 (Thai-to-Myanmar), and 25.5 (Myanmar-to-Thai)."
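The BLEU scores quoted above compare machine output against reference translations via clipped n-gram precision and a brevity penalty. As a rough illustration only, the sketch below implements a minimal corpus-level BLEU in plain Python on whitespace tokens; the study's actual figures would come from a standard evaluation toolkit with its own tokenization, so the function here is a hypothetical simplification, not the authors' evaluation code.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Counter of all n-grams (as tuples) in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Minimal corpus-level BLEU (0-100): geometric mean of clipped
    n-gram precisions for n = 1..max_n, times a brevity penalty.
    Each hypothesis/reference is a list of tokens; one reference per
    hypothesis for simplicity."""
    clipped = [0] * max_n   # matched n-gram counts, clipped per reference
    total = [0] * max_n     # total hypothesis n-gram counts
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        hyp_len += len(hyp)
        ref_len += len(ref)
        for n in range(1, max_n + 1):
            hyp_ng = ngrams(hyp, n)
            ref_ng = ngrams(ref, n)
            total[n - 1] += max(len(hyp) - n + 1, 0)
            clipped[n - 1] += sum(min(c, ref_ng[g]) for g, c in hyp_ng.items())
    if min(clipped) == 0:
        return 0.0  # some precision is zero; smoothed variants exist
    log_prec = sum(math.log(c / t) for c, t in zip(clipped, total)) / max_n
    # Brevity penalty discourages hypotheses shorter than the reference.
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)

# A hypothesis identical to its reference scores 100.
hyp = ["the cat sat on the mat".split()]
ref = ["the cat sat on the mat".split()]
print(round(corpus_bleu(hyp, ref), 1))  # → 100.0
```

Real evaluations typically use multiple references, smoothing for zero precisions, and a fixed tokenization so that scores are comparable across papers.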