Findings from University of Badji Mokhtar Update Understanding of Machine Translation (Low Resource Arabic Dialects Transformer Neural Machine Translation Improvement Through Incremental Transfer of Shared Linguistic Features)

New research on Machine Translation is the subject of a report. According to news originating from Annaba, Algeria, by NewsRx correspondents, research stated, "Neural machine translation (NMT) is a complex process that deals with many grammatical complexities. Today, transfer learning (TL) has emerged as a leading method in machine translation, enhancing accuracy with ample source data for limited target data."

Financial supporters for this research include the Direction Generale de la Recherche Scientifique et du Developpement Technologique (DGRSDT) and the Laboratoire de Recherche Informatique (LRI).

Our news journalists obtained a quote from the research from the University of Badji Mokhtar: "Yet, low-resource languages such as Arabic dialects lack substantial source data. This study aims to enable an NMT model, trained on a sparse Arabic dialect corpus, to translate a specific dialect with a limited corpus, addressing this gap. This paper introduces an incremental transfer learning approach tailored for translating low-resource languages. The method utilizes various related language corpora, employing an incremental fine-tuning strategy to transfer linguistic features from a grandparent model to a child model. In our case, knowledge is transferred from a broad set of Arabic dialects to the Maghrebi dialect subset and then to specific low-resource dialects such as Algerian, Tunisian, and Moroccan, employing Transformer and attentional sequence-to-sequence models. The evaluation of the proposed strategy on Algerian, Tunisian, and Moroccan dialects demonstrates superior translation performance compared to traditional TL methods. Using the Transformer model, it shows improvements of 80%, 62%, and 58% for the Algerian, Tunisian, and Moroccan dialects, respectively."
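The staged transfer described in the quote (a broad multi-dialect "grandparent" model, fine-tuned into a Maghrebi "parent", then into a dialect-specific "child") can be sketched with a deliberately simplified toy. This is a minimal illustration only, not the authors' implementation: the "model" here is just a source-to-target lookup table standing in for shared Transformer parameters, and all corpus pairs are invented examples.

```python
# Toy sketch of incremental (grandparent -> parent -> child) transfer learning.
# Assumption: each "corpus" is a list of (source, target) pairs, and the "model"
# is a plain dict standing in for the neural model's learned parameters.

def fine_tune(parent_model, corpus):
    """Inherit the parent's learned mappings, then adapt to the new corpus."""
    child_model = dict(parent_model)   # transfer shared linguistic features
    child_model.update(dict(corpus))   # fine-tune on the narrower corpus
    return child_model

# Stage 1: grandparent model trained on a broad set of Arabic dialects (made-up data)
grandparent = fine_tune({}, [("salam", "hello"), ("shukran", "thanks")])

# Stage 2: parent model fine-tuned on the Maghrebi dialect subset
parent = fine_tune(grandparent, [("bezzaf", "a lot")])

# Stage 3: child model fine-tuned on one low-resource dialect (e.g. Algerian)
child = fine_tune(parent, [("wesh", "what's up")])

print(child["salam"])  # -> hello      (feature inherited from the grandparent)
print(child["wesh"])   # -> what's up  (dialect-specific feature learned last)
```

The point of the staging is that the child never trains from scratch: each stage starts from the parameters of a model trained on a larger, related corpus, so dialect-specific data only has to supply what the broader stages did not.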

Annaba, Algeria, Emerging Technologies, Machine Learning, Machine Translation, University of Badji Mokhtar

2024

Robotics & Machine Learning Daily News


ISSN:
Year, Vol. (Issue): 2024 (Feb. 28)