Knowledge Graph Completion with Knowledge Enhancement and Contrastive Learning
Knowledge Graph Completion (KGC) is an important means of improving the quality of knowledge graphs (KGs). Existing KGC methods are mainly divided into structure-based and description-based methods. Structure-based methods perform poorly when inferring the long-tailed entities common in KGs, while description-based methods make insufficient use of descriptive information and of the information carried by negative samples. To address these challenges, this paper proposes a KGC method with knowledge enhancement and contrastive learning, named KEKGC. A manually defined template converts each triple and its descriptive information into a coherent natural-language statement, which is fed to a Pre-trained Language Model (PLM), enhancing the language model's comprehension of the structural and descriptive knowledge of each triple. On this basis, a contrastive learning framework is incorporated to improve the efficiency and accuracy of the link prediction task: a memory bank stores entity embedding vectors, from which positive and negative samples are selected and trained with InfoNCE loss. Experimental results on the WN18RR dataset show that, compared with MEM-KGC, KEKGC improves Mean Reciprocal Rank (MRR) on link prediction by 5.5 points and improves Hits@1, Hits@3, and Hits@10 by 2.8, 0.7, and 4.2 percentage points, respectively, while accuracy on the triple classification task reaches 94.1%. Hence, this method achieves higher prediction accuracy and better generalization ability, especially for long-tailed entities, and can effectively improve graph completion.
Keywords: Knowledge Graph (KG); Pre-trained Language Model (PLM); link prediction; contrastive learning; entity description
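As a rough illustration of the knowledge-enhancement step described in the abstract, the sketch below converts a triple and its entity descriptions into a single natural-language statement suitable as PLM input. The template wording and the `triple_to_statement` helper are hypothetical; the paper's exact template is not reproduced here.

```python
# A minimal sketch of the triple-to-statement conversion step.
# The template wording is an illustrative assumption, not the
# paper's actual template.

def triple_to_statement(head, relation, tail, head_desc, tail_desc):
    """Convert a (head, relation, tail) triple plus entity
    descriptions into one coherent natural-language statement."""
    relation_text = relation.replace("_", " ")
    return (
        f"{head}, which is {head_desc}, "
        f"{relation_text} {tail}, which is {tail_desc}."
    )

print(triple_to_statement(
    "Paris", "capital_of", "France",
    "a major European city", "a country in Western Europe",
))
# Paris, which is a major European city, capital of France,
# which is a country in Western Europe.
```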
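Similarly, the following is a minimal PyTorch sketch of the contrastive step, assuming a memory bank of entity embeddings that supplies negatives and an InfoNCE loss that pulls each query embedding toward its positive entity. The bank size, random negative sampling, and temperature are illustrative assumptions rather than the paper's configuration.

```python
# Sketch: memory bank + InfoNCE loss for contrastive link prediction.
# Sizes, sampling, and temperature are illustrative assumptions.
import torch
import torch.nn.functional as F

num_entities, dim, batch, num_neg, tau = 10_000, 768, 32, 64, 0.05

# Memory bank holding one embedding per entity (updated elsewhere
# during training; random here for the sake of a runnable example).
memory_bank = F.normalize(torch.randn(num_entities, dim), dim=-1)

def info_nce(query, pos_idx):
    """InfoNCE loss: the positive is the target entity's bank
    embedding; negatives are randomly sampled bank entries.
    (Rare collisions with the positive are ignored for brevity.)"""
    query = F.normalize(query, dim=-1)
    pos = memory_bank[pos_idx]                       # (batch, dim)
    neg_idx = torch.randint(0, num_entities, (batch, num_neg))
    neg = memory_bank[neg_idx]                       # (batch, num_neg, dim)
    pos_sim = (query * pos).sum(-1, keepdim=True)    # (batch, 1)
    neg_sim = torch.einsum("bd,bnd->bn", query, neg) # (batch, num_neg)
    logits = torch.cat([pos_sim, neg_sim], dim=1) / tau
    labels = torch.zeros(batch, dtype=torch.long)    # positive at index 0
    return F.cross_entropy(logits, labels)

queries = torch.randn(batch, dim)   # stand-in for PLM entity embeddings
targets = torch.randint(0, num_entities, (batch,))
print(info_nce(queries, targets).item())
```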