Relation Extraction Model Incorporating Entity Attention and Semantic Information
Knowledge graphs map the real world onto the data world through a semantic network, supporting many specific applications in industry. Entity relation extraction is a core step in knowledge graph construction; however, automatic extraction of relational knowledge from documents to supplement knowledge bases has progressed slowly. Addressing the low utilization of entity position information in relation extraction tasks and the inadequate extraction of text features, this paper proposes a BERT-based entity relation extraction model: a downstream long short-term memory network (LSTM) handles long-range dependencies effectively and is combined with an entity-position-aware self-attention mechanism to form a new composite model. The model is evaluated on two common datasets, and the experimental results show that it reaches F1 scores of 67.1% on the TACRED dataset and 87.8% on the SemEval-2010 Task 8 dataset, outperforming several previous models.
pretraining model; semantic relation extraction; attention mechanism; long short-term memory network; natural language processing
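The abstract's core mechanism, an attention layer whose token scores depend on each token's distance to the subject and object entities, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the random matrices stand in for learned parameters, and the hidden states `h` stand in for the BERT+LSTM encoder outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def position_aware_attention(h, subj_pos, obj_pos, d_pos=4):
    """Entity-position-aware attention pooling over a token sequence.

    h        : (T, d) hidden states (stand-in for BERT+LSTM outputs)
    subj_pos : (T,) signed distance of each token to the subject entity
    obj_pos  : (T,) signed distance of each token to the object entity
    All weight matrices below are random stand-ins for learned parameters.
    """
    T, d = h.shape
    # Embed relative positions, clipped into a small position vocabulary.
    pos_emb = rng.normal(size=(2 * T + 1, d_pos))
    ps = pos_emb[np.clip(subj_pos, -T, T) + T]          # (T, d_pos)
    po = pos_emb[np.clip(obj_pos, -T, T) + T]           # (T, d_pos)
    # Score each token from its hidden state and its entity distances.
    W_h = rng.normal(size=(d, d))
    W_p = rng.normal(size=(2 * d_pos, d))
    v = rng.normal(size=(d,))
    scores = np.tanh(h @ W_h + np.concatenate([ps, po], axis=1) @ W_p) @ v
    # Softmax over tokens: weights are positive and sum to 1.
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    # Weighted sum of hidden states -> sentence representation
    # fed to the relation classifier.
    return alpha @ h, alpha

# Toy sequence of 6 tokens: subject at position 1, object at position 4.
T, d = 6, 8
h = rng.normal(size=(T, d))
subj_pos = np.arange(T) - 1
obj_pos = np.arange(T) - 4
vec, alpha = position_aware_attention(h, subj_pos, obj_pos)
print(vec.shape, alpha.shape)
```

In a trained model, tokens near the two entity mentions tend to receive higher weights, so the pooled vector emphasizes the context that expresses the relation between them.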