Entity relation extraction is one of the key tasks of knowledge graph construction in natural language processing. It enables automatic updating and expansion of the knowledge graph and provides an important knowledge-base foundation for downstream tasks. At present, most entity relation extraction methods extract features from a single perspective, resulting in insufficient feature representation. Moreover, cascading errors accumulate severely, making it difficult to handle overlapping and nested entity relations and greatly affecting the accuracy and efficiency of entity relation extraction. To address these problems jointly, we propose a new joint entity relation extraction method that combines semantic and dependency-syntactic information. First, the pre-trained language model BERT is used to extract semantic features. Then, a syntactic attention graph convolutional network is used to obtain syntactic features that fuse dependency information. Finally, the dependency-syntactic features and semantic features are combined to predict the positions of subject and object entities across the multiple relations in a sentence. Experimental results show that the F1 score of the proposed model reaches 92.8% and 91.1% on the NYT and WebNLG public datasets, respectively. Compared with the baseline model and other deep learning models, the proposed model achieves better results on overlapping entity extraction, which verifies its effectiveness.
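To make the described pipeline concrete, the following is a minimal sketch of the fusion idea, not the authors' implementation: BERT token embeddings are combined with graph-convolution features computed over a dependency adjacency matrix, and linear heads score subject/object start positions. The class name, `gcn_dim`, `dep_adj`, and the choice of `bert-base-cased` are hypothetical, and the attention weighting of the dependency graph as well as the multi-relation decoding are omitted.

```python
# A minimal, hypothetical sketch of fusing semantic (BERT) and dependency-syntactic (GCN) features.
import torch
import torch.nn as nn
from transformers import BertModel


class SemanticSyntacticFusion(nn.Module):
    def __init__(self, bert_name="bert-base-cased", gcn_dim=256):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # One graph-convolution layer over the dependency adjacency matrix
        # (the syntactic attention weighting from the paper is omitted here).
        self.gcn = nn.Linear(hidden, gcn_dim)
        # Heads scoring each token as a subject / object start position.
        self.subj_head = nn.Linear(hidden + gcn_dim, 1)
        self.obj_head = nn.Linear(hidden + gcn_dim, 1)

    def forward(self, input_ids, attention_mask, dep_adj):
        # Semantic features from the pre-trained language model.
        h = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        # Dependency-syntactic features: propagate over the degree-normalized dependency graph.
        deg = dep_adj.sum(dim=-1, keepdim=True).clamp(min=1)
        syn = torch.relu(self.gcn(torch.bmm(dep_adj / deg, h)))
        # Fuse both feature views and predict per-token subject/object start probabilities.
        fused = torch.cat([h, syn], dim=-1)
        return torch.sigmoid(self.subj_head(fused)), torch.sigmoid(self.obj_head(fused))
```

In this sketch, `dep_adj` is a batch of (sequence length x sequence length) adjacency matrices built from a dependency parse of the input sentence; concatenating the two feature views before the prediction heads is one straightforward way to realize the "combine dependency-syntactic and semantic features" step described above.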