
Few-shot Named Entity Recognition Based on Prompt Learning

In recent years, pre-trained language models have achieved significant success in natural language processing (NLP) tasks, particularly in named entity recognition (NER). However, existing prompt-based NER models rely on intricate discrete prompt design and post-processing, which increases the complexity of model development. Building on a study of existing algorithms, a contrastive learning method combining discrete and continuous prompts is proposed for NER tasks. The approach automatically learns continuous entity representations through contrastive learning, streamlining the model. Experimental results demonstrate that the framework excels on large-scale datasets and achieves efficient few-shot learning in resource-constrained scenarios. This new paradigm offers a viable and effective direction for the exploration of prompt-based learning in NER tasks.
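The abstract's core mechanism, learning continuous entity representations via contrastive learning, can be illustrated with a minimal InfoNCE-style sketch in NumPy. The function name, temperature value, and toy vectors below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def contrastive_loss(anchor, positives, negatives, temperature=0.1):
    """InfoNCE-style loss: pull the anchor representation toward
    same-entity-type positives and push it away from negatives.
    anchor is a 1-D array; positives/negatives are lists of 1-D arrays."""
    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    pos = np.array([cosine(anchor, p) for p in positives]) / temperature
    neg = np.array([cosine(anchor, n) for n in negatives]) / temperature
    sims = np.concatenate([pos, neg])
    # log-sum-exp over all candidates, shifted for numerical stability
    lse = np.log(np.exp(sims - sims.max()).sum()) + sims.max()
    # average -log p(positive | anchor) over the positive set
    return float(np.mean(lse - pos))

# Toy entity representations: the anchor is close to the positive
# (same entity type) and far from the negative (different type).
anchor   = np.array([1.0, 0.0])
positive = np.array([0.9, 0.1])
negative = np.array([0.0, 1.0])

aligned    = contrastive_loss(anchor, [positive], [negative])
misaligned = contrastive_loss(anchor, [negative], [positive])
assert aligned < misaligned  # well-separated entity types give a lower loss
```

Minimizing such a loss over prompt-induced representations is what lets the entity encoding be learned automatically rather than hand-designed through discrete prompts alone.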

Keywords: prompt learning; contrastive learning; NER; pre-trained language models

陈妍、辛逍、肖晓丹


Shenzhen Power Supply Bureau Co., Ltd., China Southern Power Grid, Shenzhen 518046, China


2024

现代计算机 (Modern Computer), 中大控股

Impact factor: 0.292
ISSN: 1007-1423
Year, volume (issue): 2024, 30(17)