Journal of Lanzhou Institute of Technology, 2024, Vol. 31, Issue 4: 54-59.

Dual Channel Neural Network Named Entity Recognition Integrated with Attention Mechanism

陶露 (Tao Lu)

Author Information

  • 1. School of Electrical and Information Engineering, Wanjiang Institute of Technology (皖江工学院), Ma'anshan 243000, Anhui, China

Abstract

Aiming at the problems that deep-learning methods for named entity recognition lack rich semantic information and that redundant information degrades recognition, a dual-channel neural-network named-entity-recognition model incorporating an attention mechanism (BW-ATT-NERM) is proposed. First, the Word2vec and BERT language models convert the text into corresponding vector representations, which serve as the model input; then a BiGRU network extracts text feature vectors, from which an attention mechanism generates a weighted semantic representation; finally, a CRF layer trains on and learns the relationship between the text feature vectors and the output labels, predicting the optimal label sequence. Experimental results show that the BW-ATT-NERM model reaches an average accuracy of 95.97%, an average recall of 94.26%, and an average F1 score of 95.11%; compared with the baseline recognition model (LSTM-CRF), the recognition effect is significantly better.

Keywords

named entity recognition / dual channel / bidirectional GRU / attention mechanism
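The CRF decoding step named in the abstract and keywords is typically realized as a Viterbi search over emission scores (from the feature network) and learned tag-transition scores. The sketch below uses hand-picked toy tags and scores — they are illustrative assumptions, not values from the paper.

```python
def viterbi_decode(emissions, transitions, tags):
    """Return the highest-scoring tag sequence, as a CRF layer would.

    emissions: list over time steps of {tag: score}.
    transitions: {(prev_tag, tag): score}; missing pairs score 0.
    """
    # best[t][tag] = (best path score ending in tag at step t, backpointer)
    best = [{t: (emissions[0][t], None) for t in tags}]
    for step in emissions[1:]:
        row = {}
        for t in tags:
            prev, score = max(
                ((p, best[-1][p][0] + transitions.get((p, t), 0.0)) for p in tags),
                key=lambda x: x[1])
            row[t] = (score + step[t], prev)
        best.append(row)
    # Backtrack from the best final tag to recover the path.
    last = max(tags, key=lambda t: best[-1][t][0])
    path = [last]
    for row in reversed(best[1:]):
        path.append(row[path[-1]][1])
    return list(reversed(path))

# Toy usage: the (B -> I) transition bonus pulls the second tag toward I.
tags = ["B", "I", "O"]
emissions = [{"B": 2.0, "I": 0.0, "O": 1.0},
             {"B": 0.0, "I": 1.0, "O": 1.0}]
transitions = {("B", "I"): 2.0, ("O", "I"): -2.0}
print(viterbi_decode(emissions, transitions, tags))  # -> ['B', 'I']
```

Training the CRF learns these transition scores so that invalid label patterns (e.g. an I tag with no preceding B) are penalized, which is why the abstract pairs the CRF with the attention-weighted features for final label prediction.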


Publication year: 2024
Journal: Journal of Lanzhou Institute of Technology (兰州工业学院学报)
Publisher: Lanzhou Institute of Technology
Impact factor: 0.205
ISSN: 1009-2269
References: 5