
A Relation Extraction Model Incorporating Entity Attention and Semantic Information

Knowledge graphs build a mapping between the real world and the data world through semantic networks, supporting many concrete applications across industries; entity relation extraction is a core step in knowledge graph construction. To address three problems in the relation extraction task, namely low utilization of entity-related features, insufficient text feature extraction, and the inability of some pretrained models to capture sequence features well, this paper proposes a new model built on the BERT pretrained model: downstream, it exploits the ability of the long short-term memory network (LSTM) to handle long-term dependencies, and combines it with an entity-position self-aware attention mechanism. The model is evaluated on two public data sets, and the experimental results show that its F1 scores reach 67.1% on the TACRED data set and 87.8% on the SemEval 2020 Task 8 data set, both outperforming some previous models.
Relation Extraction Model Incorporating Entity Attention and Semantic Information
Knowledge graphs build a mapping between the real world and the data world through semantic networks, which supports many concrete applications in industry. Entity relation extraction is a core step in the construction of knowledge graphs. However, the automatic extraction of relational knowledge from documents to supplement knowledge bases has developed slowly. To address the low utilization of entity-related features and the inadequate text feature extraction in the relation extraction task, this paper proposes an entity relation extraction model based on BERT: downstream, it uses a long short-term memory network (LSTM) to deal effectively with long-term dependencies, and combines it with an entity-position self-aware attention mechanism to form a new composite model. The model is tested on two public data sets, and the experimental results show that its F1 scores reach 67.1% and 87.8% on the TACRED and SemEval 2020 Task 8 data sets respectively, which is better than some previous models.
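The abstract describes an encoder stack (BERT, then an LSTM) whose hidden states are pooled by an attention mechanism that is aware of each token's position relative to the two entity mentions. The paper's exact formulation is not given here, so the following is only a minimal NumPy sketch of one plausible reading: attention scores are computed from each hidden state concatenated with its normalized distances to the two entities. All names (`Wa`, `v`, `e1_pos`, `e2_pos`) and the random stand-in for the BERT+LSTM hidden states are illustrative assumptions, not the authors' code.

```python
# Sketch (assumed formulation, not the paper's exact one): entity-position
# self-aware attention pooling over encoder/LSTM hidden states.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def entity_position_attention(H, e1_pos, e2_pos, Wa, v):
    """Pool hidden states H (seq_len, d) into a single vector, biasing the
    attention scores by each token's distance to the two entity mentions."""
    seq_len, d = H.shape
    # Normalized relative distances to each entity, in [0, 1].
    d1 = np.abs(np.arange(seq_len) - e1_pos) / seq_len
    d2 = np.abs(np.arange(seq_len) - e2_pos) / seq_len
    pos = np.stack([d1, d2], axis=1)           # (seq_len, 2)
    feats = np.concatenate([H, pos], axis=1)   # (seq_len, d + 2)
    scores = np.tanh(feats @ Wa) @ v           # one score per token
    alpha = softmax(scores)                    # attention weights, sum to 1
    return alpha @ H, alpha                    # pooled vector (d,), weights

rng = np.random.default_rng(0)
seq_len, d = 12, 8
H = rng.normal(size=(seq_len, d))              # stand-in for BERT+LSTM states
Wa = rng.normal(size=(d + 2, d))
v = rng.normal(size=(d,))
pooled, alpha = entity_position_attention(H, e1_pos=2, e2_pos=9, Wa=Wa, v=v)
print(pooled.shape, alpha.shape)
```

In a full model, `pooled` would be fed to a softmax classifier over relation labels; here the point is only how entity positions enter the attention score.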

pretraining model; semantic relation extraction; attention mechanism; long short-term memory networks; natural language processing

Liu Yunteng


School of Internet of Things Engineering, Jiangnan University, Wuxi 214000, China

pretraining model; semantic relation extraction; attention mechanism; long short-term memory network; natural language processing


Computer and Digital Engineering
No. 709 Research Institute, China Shipbuilding Industry Corporation


CSTPCD
Impact factor: 0.355
ISSN:1672-9722
Year, volume (issue): 2024, 52(2)