Knowledge graph completion by integrating textual information and graph structure information
Based on path query information, we propose a graph attention model that effectively integrates textual and graph-structure information in knowledge graphs, thereby enhancing knowledge graph completion. For textual information, a dual encoder based on pre-trained language models is used to obtain separate embedding representations of entities and path queries. An attention mechanism then aggregates the path-query embeddings to capture graph-structure information and update the entity embeddings. The model is trained with contrastive learning, and experiments on multiple knowledge graph datasets show strong results in both transductive and inductive settings. These results demonstrate that combining pre-trained language models with graph neural networks effectively captures both textual and graph-structure information, thereby improving knowledge graph completion.
Keywords: knowledge graph completion; pre-trained language model; contrastive learning; graph neural networks
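To make the described pipeline concrete, below is a minimal sketch of a dual encoder with attention-based path aggregation and an InfoNCE-style contrastive loss, assuming a PyTorch/HuggingFace setup. The module names, the BERT backbone, the residual entity update, the number of attention heads, and the temperature are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: dual PLM encoder + attention over path queries + contrastive loss.
# All architectural specifics here are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel


class DualEncoder(nn.Module):
    """Two PLM encoders: one for entity text, one for path-query text."""

    def __init__(self, plm_name: str = "bert-base-uncased"):
        super().__init__()
        self.entity_encoder = AutoModel.from_pretrained(plm_name)
        self.path_encoder = AutoModel.from_pretrained(plm_name)
        dim = self.entity_encoder.config.hidden_size
        # Attention that aggregates an entity's path-query embeddings.
        self.path_attention = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    @staticmethod
    def _cls(encoder: nn.Module, inputs: dict) -> torch.Tensor:
        # Use the [CLS] position as the sequence-level representation.
        return encoder(**inputs).last_hidden_state[:, 0]

    def forward(self, entity_inputs: dict, path_inputs: dict, num_paths: int):
        e = self._cls(self.entity_encoder, entity_inputs)      # (B, D)
        p = self._cls(self.path_encoder, path_inputs)          # (B*K, D)
        p = p.view(e.size(0), num_paths, -1)                   # (B, K, D)
        # The entity attends over its path queries; the aggregate carries
        # graph-structure information back into the entity embedding.
        agg, _ = self.path_attention(e.unsqueeze(1), p, p)     # (B, 1, D)
        return e + agg.squeeze(1)                              # residual update


def info_nce(queries: torch.Tensor, keys: torch.Tensor, tau: float = 0.05):
    """Contrastive (InfoNCE) loss with in-batch negatives: the i-th key
    is the positive target for the i-th query."""
    q = F.normalize(queries, dim=-1)
    k = F.normalize(keys, dim=-1)
    logits = q @ k.t() / tau                                   # (B, B)
    labels = torch.arange(q.size(0), device=q.device)
    return F.cross_entropy(logits, labels)
```

In this sketch, the K path queries per entity are tokenized and flattened into B*K sequences before encoding, and in-batch negatives are used as a common, scalable choice for contrastive training; the paper may use a different negative-sampling scheme.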