Lexical simplification replaces difficult words in a sentence with simpler ones, without changing the sentence's structure or meaning, to improve the readability of text for specific groups of readers. This paper proposes PTCLS (Prompt-tuning Based Chinese Lexical Simplification), a prompt-tuning based method for Chinese lexical simplification. PTCLS adopts a BART-based backbone, which can naturally generate substitutes of varying lengths, and training requires tuning only a small number of parameters. Experiments on a public Chinese lexical simplification dataset show that the proposed method substantially outperforms the current best baseline system, BERT-LS. In-depth analysis reveals that prompt tuning needs only a small amount of labeled data to outperform full-parameter fine-tuning, manual prompts, and unsupervised methods, with especially notable gains on difficult words outside the Chinese synonym dictionary.
Chinese Lexical Simplification Based on Prompt-Tuning
Lexical simplification replaces complex words with simpler substitutes without changing the structure or meaning of the original sentence, improving the readability of text for specific groups of readers. In this paper, we propose a prompt-tuning based lexical simplification framework, PTCLS (Prompt-tuning Based Chinese Lexical Simplification). PTCLS adopts the BART architecture, which can naturally generate multi-token expressions as substitutes and incorporate supervision by tuning only a small number of parameters. Experimental results on a public Chinese lexical simplification dataset show that our method obtains significant improvements over the current best Chinese lexical simplification system, BERT-LS. In-depth analysis reveals that the proposed method needs only a small amount of annotated data to achieve better performance than full-parameter fine-tuning, manual prompts, and unsupervised methods, especially for complex words outside the Chinese synonym dictionary.
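To make the core idea concrete, the following is a minimal, hedged PyTorch sketch of prompt tuning for a seq2seq model: the backbone (standing in for BART) is frozen, and only a small block of learnable "soft prompt" embeddings prepended to the encoder input is trained. All names, sizes, and the toy backbone are illustrative assumptions, not the authors' actual PTCLS implementation.

```python
# Sketch of prompt tuning (illustrative; not the authors' PTCLS code).
# A frozen seq2seq backbone plus a small trainable soft-prompt matrix
# prepended to the encoder input; only the prompt receives gradients.
import torch
import torch.nn as nn

class PromptTunedSeq2Seq(nn.Module):
    def __init__(self, vocab=100, d_model=32, prompt_len=8):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        # Tiny stand-in for a pretrained encoder-decoder such as BART.
        self.backbone = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=1, num_decoder_layers=1,
            dim_feedforward=64, batch_first=True)
        self.out = nn.Linear(d_model, vocab)
        # Freeze everything defined so far: the backbone stays fixed.
        for p in self.parameters():
            p.requires_grad = False
        # The only trainable parameters: prompt_len soft-prompt vectors.
        self.prompt = nn.Parameter(torch.randn(prompt_len, d_model))

    def forward(self, src_ids, tgt_ids):
        src = self.embed(src_ids)
        prompt = self.prompt.unsqueeze(0).expand(src.size(0), -1, -1)
        src = torch.cat([prompt, src], dim=1)  # prepend soft prompt
        hidden = self.backbone(src, self.embed(tgt_ids))
        return self.out(hidden)  # token logits for the substitute

model = PromptTunedSeq2Seq()
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable={trainable} / total={total}")

# Decoding is free to emit any number of target tokens, which is how a
# seq2seq backbone naturally produces substitutes of varying lengths.
logits = model(torch.randint(0, 100, (2, 5)), torch.randint(0, 100, (2, 3)))
print(logits.shape)  # (batch, target_len, vocab)
```

Here only `prompt_len * d_model` parameters (256 in this toy configuration) are updated, which is the "tuning only a small number of parameters" property the abstract claims; the generated target sequence length is unconstrained, matching the ability to produce multi-token substitutes.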