An Entity-Relation Extraction Model Fusing a Self-Attention Mechanism and Entity Type Knowledge
Entity-relation extraction from unstructured text has become a key task in natural language processing. At present, mainstream methods adopt joint extraction, which automatically captures the dependencies between entities and relations during training and improves the extraction of both. However, these methods ignore the type knowledge of entities, which leads to many redundant computations and incorrect results. To this end, we present a joint entity-relation extraction method that integrates a self-attention mechanism and entity type knowledge. First, the pretrained BERT model is used as the encoder to obtain a vector representation of each character in the sentence, and the final semantic representation is then produced by a bidirectional LSTM layer. Second, head and tail entities are identified from the output of the encoder layer. Then, the semantic representation of each head entity is iteratively integrated into the sentence representation to detect potential semantic relations under the constraints of the head entity's type. Finally, the head entity and relation are fed into the self-attention module to identify the corresponding tail entity and obtain the entity-relation triples. Experimental results on the public NYT and WebNLG datasets show that the F1 score of the proposed model on the joint entity-relation extraction task reaches 93.2% and 93.3%, respectively, a significant improvement over current mainstream models.
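To make the described pipeline concrete, the sketch below outlines one plausible PyTorch realization of the abstract's four steps: a BERT encoder refined by a BiLSTM, pointer-style head-entity tagging, relation detection conditioned on the head entity, and a self-attention module that locates the tail entity. The paper's code is not given here, so the class name `JointExtractionSketch`, the pointer-network tagging scheme, the additive fusion of the head-entity vector, and all hyperparameters are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class JointExtractionSketch(nn.Module):
    """Minimal sketch (assumptions labeled): BERT -> BiLSTM encoder,
    head-entity pointers, head-conditioned relation detection, and a
    self-attention tail-entity tagger, per the abstract's description."""

    def __init__(self, num_relations: int, hidden: int = 768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-cased")
        # BiLSTM refines per-character BERT vectors into the final
        # semantic representation (hidden//2 per direction -> hidden).
        self.bilstm = nn.LSTM(hidden, hidden // 2,
                              batch_first=True, bidirectional=True)
        # Pointer-style taggers for head-entity start/end positions
        # (assumed tagging scheme; the paper may use a different one).
        self.head_start = nn.Linear(hidden, 1)
        self.head_end = nn.Linear(hidden, 1)
        # Relation detector over the sentence fused with a head entity.
        self.rel_classifier = nn.Linear(hidden, num_relations)
        self.rel_embed = nn.Embedding(num_relations, hidden)
        # Self-attention module conditioning tail tagging on the
        # (head entity, relation) pair.
        self.self_attn = nn.MultiheadAttention(hidden, num_heads=8,
                                               batch_first=True)
        self.tail_start = nn.Linear(hidden, 1)
        self.tail_end = nn.Linear(hidden, 1)

    def encode(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        h, _ = self.bilstm(out.last_hidden_state)        # (B, T, hidden)
        return h

    def forward(self, input_ids, attention_mask, head_span):
        h = self.encode(input_ids, attention_mask)
        # 1) Head-entity pointers: per-token start/end probabilities.
        head_start_p = torch.sigmoid(self.head_start(h)).squeeze(-1)
        head_end_p = torch.sigmoid(self.head_end(h)).squeeze(-1)
        # 2) Fuse a given head-entity span (mean of its token vectors)
        #    into the sentence representation (additive fusion is an
        #    assumption), then detect candidate relations.
        s, e = head_span
        head_vec = h[:, s:e + 1].mean(dim=1, keepdim=True)   # (B, 1, hidden)
        fused = h + head_vec                                  # broadcast fusion
        rel_logits = self.rel_classifier(fused.mean(dim=1))   # (B, num_relations)
        # 3) Query the fused sentence with head + relation through
        #    self-attention, then tag the corresponding tail span.
        rel_ids = rel_logits.argmax(dim=-1)
        query = fused + self.rel_embed(rel_ids).unsqueeze(1)
        attn_out, _ = self.self_attn(query, fused, fused)
        tail_start_p = torch.sigmoid(self.tail_start(attn_out)).squeeze(-1)
        tail_end_p = torch.sigmoid(self.tail_end(attn_out)).squeeze(-1)
        return head_start_p, head_end_p, rel_logits, tail_start_p, tail_end_p
```

In this reading, iterating the forward pass over each detected head-entity span realizes the abstract's "iteratively integrated into the sentence representation" step, and conditioning tail tagging on the predicted relation is what prunes triples inconsistent with the head entity's type; both are interpretations of the prose, not confirmed design details.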