Construction of Chinese Classical-Modern Translation Model Based on Pre-trained Language Model
Wu Mengcheng¹, Liu Chang¹, Meng Kai², Wang Dongbo¹
Author Information
- 1. College of Information Management, Nanjing Agricultural University, Nanjing 210095; Key Research Base of Philosophy and Social Sciences in Jiangsu Universities for Humanities and Social Computing, Nanjing 210095; Research Center for Correlation of Domain Knowledge, Nanjing Agricultural University, Nanjing 210095
- 2. College of Marxism, Nanjing Agricultural University, Nanjing 210095
Abstract
This study aims to construct and validate a classical-to-modern Chinese translation model based on pre-trained language models, providing strong technical support for research on classical Chinese and for the inheritance and dissemination of cultural heritage. The study selected a total of 300,000 pairs of carefully curated parallel corpora from the Twenty-Four Histories as the experimental dataset and developed a new translation model, Siku-Trans. This model innovatively combines Siku-RoBERTa (as the encoder) and Siku-GPT (as the decoder), both designed specifically for classical Chinese, into an efficient encoder-decoder architecture. To comprehensively evaluate the performance of Siku-Trans, the study introduced three models as control groups: OpenNMT, SikuGPT, and SikuBERT_UNILM. Comparative analysis of each model's performance on classical Chinese translation tasks shows that Siku-Trans exhibits significant advantages in both translation accuracy and fluency. These results not only highlight the effectiveness of combining Siku-RoBERTa with Siku-GPT as a training strategy but also provide important references and insights for in-depth research and practical application in the field of classical Chinese translation.
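The encoder-decoder pairing described in the abstract can be illustrated with Hugging Face transformers' `EncoderDecoderModel`, which wires a bidirectional encoder to an autoregressive decoder via cross-attention. The sketch below is not the authors' code: it uses tiny, randomly initialized RoBERTa- and GPT-2-style configurations as stand-ins for Siku-RoBERTa and Siku-GPT (whose checkpoint names are not given here), purely to show the wiring.

```python
# Minimal sketch (assumption, not the paper's implementation) of a
# RoBERTa-encoder + GPT-decoder translation model using Hugging Face
# transformers. Tiny random configs stand in for the pre-trained
# Siku-RoBERTa / Siku-GPT weights; in practice one would load real
# checkpoints with EncoderDecoderModel.from_encoder_decoder_pretrained.
import torch
from transformers import (
    RobertaConfig,
    GPT2Config,
    EncoderDecoderConfig,
    EncoderDecoderModel,
)

# Encoder: RoBERTa-style bidirectional Transformer (reads classical Chinese).
enc_cfg = RobertaConfig(
    vocab_size=100, hidden_size=32, num_hidden_layers=2,
    num_attention_heads=2, intermediate_size=64,
)
# Decoder: GPT-style autoregressive Transformer (generates modern Chinese);
# cross-attention lets it attend to the encoder's hidden states.
dec_cfg = GPT2Config(
    vocab_size=100, n_embd=32, n_layer=2, n_head=2,
    add_cross_attention=True,
)
cfg = EncoderDecoderConfig.from_encoder_decoder_configs(enc_cfg, dec_cfg)
model = EncoderDecoderModel(config=cfg)

src = torch.randint(0, 100, (1, 8))  # dummy classical-Chinese token ids
tgt = torch.randint(0, 100, (1, 6))  # dummy modern-Chinese token ids
out = model(input_ids=src, decoder_input_ids=tgt)
print(out.logits.shape)  # (batch, target_length, vocab_size)
```

With real checkpoints, the same object would be trained on the parallel corpus with a standard cross-entropy objective (passing `labels=` to the forward call) and decoded with `model.generate`.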
Keywords
Language model / Machine translation / Ancient Chinese translation / Siku-RoBERTa / Siku-GPT / Siku-Trans
Publication Year
2024