New Machine Translation Study Results Reported from University of Manchester (Neural machine translation of clinical text: an empirical investigation into multilingual pre-trained language models and transfer-learning)


By a News Reporter-Staff News Editor at Robotics & Machine Learning Daily News -- Fresh data on machine translation are presented in a new report. According to news reporting from the University of Manchester by NewsRx journalists, the research stated, "Clinical text and documents contain very rich information and knowledge in healthcare, and their processing using state-of-the-art language technology becomes very important for building intelligent systems for supporting healthcare and social good. This processing includes creating language understanding models and translating resources into other natural languages to share domain-specific cross-lingual knowledge."

Financial supporters for this research include the Nuffield Foundation and UKRI/EPSRC.

The news correspondents obtained a quote from the research from the University of Manchester: "In this work, we conduct investigations on clinical text machine translation by examining multilingual neural network models using deep learning such as Transformer-based structures. Furthermore, to address the language resource imbalance issue, we also carry out experiments using a transfer learning methodology based on massive multilingual pre-trained language models (MMPLMs). The experimental results on three sub-tasks including (1) clinical case (CC), (2) clinical terminology (CT), and (3) ontological concept (OC) show that our models achieved top-level performances in the ClinSpEn-2022 shared task on English-Spanish clinical domain data. Furthermore, our expert-based human evaluations demonstrate that the small-sized pre-trained language model (PLM) outperformed the other two extra-large language models by a large margin in the clinical domain fine-tuning, which finding was never reported in the field.
Finally, the transfer learning method works well in our experimental setting using the WMT21fb model to accommodate a new language space, Spanish, that was not seen at the pre-training stage within WMT21fb itself, which deserves more exploitation for clinical knowledge transformation, e.g. to investigate into more languages. These research findings can shed some light on domain-specific machine translation development, especially in clinical and healthcare fields."

University of Manchester | Emerging Technologies | Machine Learning | Machine Translation

2024

Robotics & Machine Learning Daily News


Year, Volume (Issue): 2024 (Mar. 8)