Modern Computer 2024, Vol. 30, Issue (17): 49-54. DOI: 10.3969/j.issn.1007-1423.2024.17.009

Few-shot named entity recognition based on prompt learning

陈妍¹ 辛逍¹ 肖晓丹¹
Author information

  • 1. Shenzhen Power Supply Bureau Co., Ltd. of China Southern Power Grid, Shenzhen 518046


Abstract

In recent years, pre-trained language models have achieved significant success in natural language processing (NLP) tasks, particularly in named entity recognition (NER). However, existing prompt-based NER models rely on intricate design of discrete prompts and on post-processing, which increases the complexity of model development. Building on a study of existing algorithms, a contrastive learning method combining discrete and continuous prompts is proposed for NER tasks. The approach automatically learns continuous entity representations through contrastive learning, simplifying the model. Experimental results demonstrate that the framework excels on large-scale datasets and achieves efficient few-shot learning in resource-constrained scenarios. This new paradigm offers a viable and effective direction for the exploration of prompt-based learning in NER tasks.
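The abstract's core idea, learning continuous entity-type representations with a contrastive objective so that a token's representation is pulled toward its gold type's prompt embedding and pushed away from the others, can be illustrated with a minimal InfoNCE-style loss. This is a generic sketch of that contrastive-learning principle, not the paper's actual implementation; the entity-type names, the toy 2-dimensional embeddings, and the `info_nce_loss` helper are all hypothetical assumptions for illustration.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors (plain lists)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce_loss(token_vec, type_embeddings, gold_type, temperature=0.1):
    """InfoNCE-style contrastive loss for one token.

    The positive is the continuous prompt embedding of the token's gold
    entity type; the embeddings of all other types act as negatives.
    Minimizing this loss pulls the token toward its gold type's embedding.
    """
    # Temperature-scaled similarity to every entity-type embedding.
    sims = {t: cosine(token_vec, e) / temperature
            for t, e in type_embeddings.items()}
    # Numerically stable log-sum-exp over all types (the denominator).
    m = max(sims.values())
    log_denom = m + math.log(sum(math.exp(s - m) for s in sims.values()))
    # -log softmax probability assigned to the gold type.
    return log_denom - sims[gold_type]

# Toy example: three hypothetical entity types with 2-D embeddings.
type_embeddings = {"PER": [1.0, 0.0], "LOC": [0.0, 1.0], "O": [-1.0, 0.0]}
token = [0.9, 0.1]  # a token representation close to the PER embedding
loss = info_nce_loss(token, type_embeddings, "PER")
```

In a real model the token vector would come from the pre-trained encoder and the type embeddings would be trainable continuous prompts; the same loss shape applies, just over learned tensors instead of fixed lists.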


Keywords

prompt learning / contrastive learning / NER / pre-trained language models


Publication year: 2024
Journal: Modern Computer (现代计算机)
Publisher: 中大控股
Impact factor: 0.292
ISSN: 1007-1423