Large Language Model Driven Multi-relational Knowledge Graph Completion Method
Knowledge graphs transform complex Internet information into an easily understandable structured format, significantly enhancing the accessibility of information. Knowledge graph completion (KGC) techniques further enhance the completeness of knowledge graphs, markedly improving the performance and user experience of general-domain applications such as intelligent question answering and recommendation systems. However, most existing KGC methods focus on triple instances in scenarios with few relation types and simple semantics, failing to fully exploit the potential of knowledge graphs in handling multi-relational facts and complex semantics. To address this issue, we propose a multi-relational knowledge graph completion method driven by a large language model (LLM). By combining the deep linguistic understanding capabilities of the LLM with the structural characteristics of knowledge graphs, the method effectively captures multi-relational facts and comprehends complex semantic scenarios. Additionally, we introduce a chain-of-thought-based prompt engineering strategy aimed at enhancing the accuracy of the completion task. Experimental results on two public knowledge graph datasets demonstrate significant performance improvements.
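The abstract only names the main ingredients of the pipeline (multi-relational facts, candidate set construction, chain-of-thought prompting), so the Python sketch below is an illustrative reading of that pipeline rather than the authors' implementation: the NaryFact schema, the build_cot_prompt wording, and the example fact and candidate list are all assumptions.

# Minimal sketch (not the paper's implementation) of LLM-driven
# multi-relational KGC with a chain-of-thought prompt.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class NaryFact:
    """An n-ary (multi-relational) fact: a core triple plus qualifier
    key-value pairs, following the common Wikidata-style convention."""
    head: str
    relation: str
    tail: str                      # "?" marks the element to complete
    qualifiers: Dict[str, str] = field(default_factory=dict)

def build_cot_prompt(fact: NaryFact, candidates: List[str]) -> str:
    """Serialize the incomplete fact and its candidate set into a
    chain-of-thought prompt for an LLM."""
    quals = "; ".join(f"{k} = {v}" for k, v in fact.qualifiers.items())
    return (
        "Complete the missing entity in this multi-relational fact.\n"
        f"Fact: ({fact.head}, {fact.relation}, {fact.tail})\n"
        f"Qualifiers: {quals}\n"
        f"Candidates: {', '.join(candidates)}\n"
        "Let's think step by step: restate what the relation and the "
        "qualifiers imply about the missing entity, eliminate candidates "
        "that contradict them, and give exactly one candidate as the "
        "final answer on the last line."
    )

# Hypothetical example. In the proposed pipeline the candidate set would
# come from a dedicated construction step over the knowledge graph
# (e.g. retrieval over KG neighborhoods); here it is hard-coded.
fact = NaryFact("Marie Curie", "award received", "?",
                {"point in time": "1911", "field": "Chemistry"})
candidates = ["Nobel Prize in Chemistry", "Nobel Prize in Physics",
              "Fields Medal"]
print(build_cot_prompt(fact, candidates))
# The resulting prompt would then be sent to an LLM; the API call is
# omitted because it depends on the model provider.

Representing an n-ary fact as a core triple plus qualifiers mirrors how Wikidata-style statements are commonly flattened for completion tasks; the serialization actually used in the paper may differ.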

Keywords: Knowledge graph; Large language model; Knowledge graph completion; Multi-relational; Candidate set construction; Chain-of-thought prompt

LIU Changcheng (刘畅成), SANG Lei (桑磊), LI Wei (李炜), ZHANG Yiwen (张以文)


School of Computer Science and Technology, Anhui University, Hefei 230601, China


Journal: Computer Science (计算机科学)
Publisher: Chongqing Southwest Information Co., Ltd. (formerly the Southwest Information Center of the Ministry of Science and Technology)
Indexed in: Peking University Core Journals (北大核心)
Impact factor: 0.944
ISSN: 1002-137X
Year, Volume (Issue): 2025, 52(1)