Knowledge graphs transform complex Internet information into an easily understandable structured format, significantly enhancing the accessibility of information. Knowledge graph completion (KGC) techniques further improve the completeness of knowledge graphs, markedly improving the performance and user experience of general-domain applications such as intelligent question answering and recommendation systems. However, most existing KGC methods focus on triple instances in scenarios with few relation types and simple semantics, failing to fully leverage the potential of graphs for handling multi-relational and semantically complex data. To address this issue, we propose a multi-relational knowledge graph completion method driven by a large language model (LLM). By combining the deep linguistic understanding capabilities of LLMs with the structural characteristics of knowledge graphs, the method effectively captures complex semantic scenarios and comprehends multi-relational relationships. In addition, we introduce a chain-of-thought-based prompt engineering strategy aimed at improving the accuracy of completion tasks. Experimental results on two public knowledge graph datasets demonstrate that the method achieves significant performance improvements.
Key words
Knowledge graph/Large language model/Knowledge graph completion/Multi-relational/Candidate set construction/Chain-of-thought prompt