Joint Entity Relation Extraction Based on Gated Convolutional Neural Networks and Self-attention Networks
Entity relation extraction is an important task in natural language processing. Its goal is to identify the target relations in text, thereby providing structured data for downstream tasks such as knowledge graph construction. In recent years, the task has attracted widespread attention and undergone continuous innovation, and the performance of current entity relation extraction methods has improved significantly; for example, the method based on potential relations and global correspondences (PRGC) effectively addresses redundancy in relation identification by introducing a relation judgment module. However, this method still faces challenges, including insufficiently rich word feature information and limited model generalization capability. Taking PRGC as the baseline, this paper proposes a joint entity relation extraction method based on gated convolutional networks (GCN) and self-attention networks (EREGS). In the encoding phase, gated convolutional networks are incorporated to capture long-distance entity features and learn more abstract feature representations, enabling the model to better understand the semantic information of the text and thereby strengthening its feature extraction capability and cross-domain generalization. In the decoding phase, a self-attention network helps the model accurately capture the correlations between entities, improving the accuracy of relation discrimination. Experimental results show that the proposed model achieves F1 scores of 93.7% and 90.8% on the NYT and WEBNLG general-domain datasets, respectively, surpassing the baseline joint entity relation extraction models. Additional experiments on a self-built glioma medical dataset (GMD) indicate that the model also exhibits superior performance and generalization ability in the medical domain.
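The abstract does not give implementation details for the two components it names. As a rough illustration only, the sketch below pairs a GLU-style gated 1D convolution (encoder side) with a multi-head self-attention block (decoder side); the layer sizes, kernel width, residual/normalization choices, and the way the two blocks are chained are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch of a gated convolutional encoder layer and a self-attention
# block, assumed components corresponding to the abstract's description.
import torch
import torch.nn as nn


class GatedConvLayer(nn.Module):
    """1D convolution with a multiplicative gate (GLU-style), applied over token features."""

    def __init__(self, hidden_dim: int, kernel_size: int = 3):
        super().__init__()
        padding = kernel_size // 2  # keep the sequence length unchanged
        self.conv = nn.Conv1d(hidden_dim, hidden_dim, kernel_size, padding=padding)
        self.gate = nn.Conv1d(hidden_dim, hidden_dim, kernel_size, padding=padding)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_dim); Conv1d expects (batch, hidden_dim, seq_len)
        h = x.transpose(1, 2)
        out = self.conv(h) * torch.sigmoid(self.gate(h))  # gated activation
        return out.transpose(1, 2) + x  # residual connection (an assumption)


class SelfAttentionBlock(nn.Module):
    """Multi-head self-attention to model correlations between token/entity features."""

    def __init__(self, hidden_dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(hidden_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn_out, _ = self.attn(x, x, x)
        return self.norm(x + attn_out)


if __name__ == "__main__":
    batch, seq_len, hidden = 2, 16, 64
    tokens = torch.randn(batch, seq_len, hidden)     # e.g. encoder token embeddings
    encoded = GatedConvLayer(hidden)(tokens)         # encoder-side gated convolution
    attended = SelfAttentionBlock(hidden)(encoded)   # decoder-side self-attention
    print(attended.shape)  # torch.Size([2, 16, 64])
```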