In recent years, pre-trained language models have achieved significant success in natural language processing (NLP) tasks, particularly in named entity recognition (NER). However, existing prompt-based NER models rely on intricately designed discrete prompts and post-processing, which increases the complexity of model development. Building on a study of existing algorithms, a contrastive learning method combining discrete and continuous prompts is proposed for NER. This approach autonomously learns continuous entity representations through contrastive learning, streamlining the model. Experimental results demonstrate that the framework excels on large-scale datasets and achieves efficient few-shot learning in resource-constrained scenarios. This new paradigm offers a viable and effective direction for exploring prompt-based learning in NER tasks.
Key words
prompt learning / contrastive learning / named entity recognition / pre-trained language models
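The core idea of learning entity representations by contrastive learning can be illustrated with a minimal sketch. The snippet below implements a generic supervised contrastive (InfoNCE-style) loss that pulls together representations sharing an entity label and pushes apart the rest; the function name, temperature value, and NumPy implementation are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Illustrative supervised contrastive loss (not the paper's exact loss).

    Representations with the same entity label are treated as positives;
    all other in-batch representations act as negatives.
    """
    # Normalize so dot products become cosine similarities.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (z @ z.T) / temperature
    n = len(z)
    # Exclude self-similarity from both positives and the partition function.
    np.fill_diagonal(sim, -np.inf)
    pos = (labels[:, None] == labels[None, :]) & ~np.eye(n, dtype=bool)
    # Row-wise log-softmax (numerically stable log-sum-exp).
    m = sim.max(axis=1, keepdims=True)
    log_prob = sim - (m + np.log(np.exp(sim - m).sum(axis=1, keepdims=True)))
    # Average log-probability over each anchor's positives.
    masked = np.where(pos, log_prob, 0.0)
    per_anchor = masked.sum(axis=1) / np.maximum(pos.sum(axis=1), 1)
    # Only anchors that have at least one positive contribute to the loss.
    return -per_anchor[pos.any(axis=1)].mean()

# Toy usage: two entity classes with well-separated embeddings.
emb = np.array([[1.0, 0.0], [1.0, 0.1], [0.0, 1.0], [0.1, 1.0]])
labels = np.array([0, 0, 1, 1])
loss = supervised_contrastive_loss(emb, labels)
```

With correctly clustered labels the loss is lower than with mismatched labels, which is exactly the signal that drives the entity representations apart by class during training.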