Chinese Lexical Simplification Based on Prompt-Tuning
Lexical simplification aims to replace complex words in a sentence with simpler substitutes without changing the structure and meaning of the original sentence, improving the readability of text for specific groups of readers. In this paper, we propose a prompt-tuning-based Chinese lexical simplification framework, PTCLS (Prompt-tuning Based Chinese Lexical Simplification). PTCLS adopts a BART-based architecture, which can naturally generate substitutes of varying length and incorporates supervision by tuning only a small number of parameters. Experimental results on a public Chinese lexical simplification dataset show that our method obtains significant improvements over the current best Chinese lexical simplification system, BERT-LS. In-depth analysis reveals that the proposed method can use a small amount of annotated data to achieve better performance than full-parameter fine-tuning, manual prompts, and unsupervised methods, especially for complex words outside the Chinese synonym dictionary.
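The abstract specifies the core recipe: a frozen BART encoder-decoder with a small set of trainable soft-prompt vectors, trained to generate a (possibly multi-character) substitute for a marked complex word. The Python sketch below illustrates that general recipe with the HuggingFace transformers library; the checkpoint (fnlp/bart-base-chinese), the input template, and the prompt length are illustrative assumptions, not the authors' PTCLS configuration.

import torch
from torch import nn
from transformers import BertTokenizer, BartForConditionalGeneration

MODEL_NAME = "fnlp/bart-base-chinese"  # assumed checkpoint; Chinese BART ships a BERT-style tokenizer
PROMPT_LEN = 20                        # assumed number of trainable soft-prompt vectors

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BartForConditionalGeneration.from_pretrained(MODEL_NAME)

# Freeze all BART weights; only the soft prompt below receives gradients,
# which is what keeps the number of tuned parameters small.
for p in model.parameters():
    p.requires_grad = False

soft_prompt = nn.Parameter(torch.randn(PROMPT_LEN, model.config.d_model) * 0.02)

def run(sentence, complex_word, target_word=None):
    # Mark the complex word in the source; this template is a hypothetical choice.
    enc = tokenizer(f"{sentence} 难词:{complex_word}", return_tensors="pt")
    # Prepend the soft prompt to the token embeddings of the encoder input.
    tok_embeds = model.get_input_embeddings()(enc["input_ids"])
    inputs_embeds = torch.cat([soft_prompt.unsqueeze(0), tok_embeds], dim=1)
    attention_mask = torch.cat(
        [torch.ones(1, PROMPT_LEN, dtype=torch.long), enc["attention_mask"]], dim=1
    )
    if target_word is None:
        # Inference: beam search yields ranked candidate substitutes, and
        # seq2seq decoding naturally produces substitutes of any length.
        out = model.generate(
            inputs_embeds=inputs_embeds,
            attention_mask=attention_mask,
            num_beams=5,
            num_return_sequences=5,
            max_new_tokens=8,
        )
        return [tokenizer.decode(o, skip_special_tokens=True) for o in out]
    # Training: standard cross-entropy against the annotated simpler word.
    labels = tokenizer(target_word, return_tensors="pt")["input_ids"]
    return model(inputs_embeds=inputs_embeds, attention_mask=attention_mask, labels=labels).loss

# Only the soft prompt is optimized during training.
optimizer = torch.optim.AdamW([soft_prompt], lr=1e-3)

Under this setup only PROMPT_LEN × d_model values are updated, which matches the abstract's claim of supervision through a small number of tuned parameters while leaving BART's ability to generate substitutes of varying length intact.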

lexical simplification; prompt learning; prompt tuning; text simplification; few-shot learning

肖子豪、程苗苗、巩捷甫、韩旭、王士进、宋巍

College of Information Engineering and Academy for Multidisciplinary Studies, Capital Normal University, Beijing 100056, China

AI Research Institute, iFLYTEK Co., Ltd., Hefei 230088, Anhui, China

State Key Laboratory of Cognitive Intelligence, Hefei 230088, Anhui, China

Funding: National Natural Science Foundation of China (62376166, 62306188); National Key Research and Development Program of China (2022YFC3303504); Beijing Municipal Education Commission Science and Technology Program (KM202010028004)

中文信息学报 (Journal of Chinese Information Processing)
Sponsored by the Chinese Information Processing Society of China and the Institute of Software, Chinese Academy of Sciences

Indexed in: CSTPCD, CHSSCD, Peking University Core Journals (北大核心)
Impact factor: 0.8
ISSN: 1003-0077
Year, Volume (Issue): 2024, 38(8)