In the era of large language models (LLMs), knowledge graphs (KGs), as a structured representation of knowledge, play an irreplaceable role in enhancing the reliability, security, and interpretability of artificial intelligence. Owing to their superior performance in semantic understanding, pre-trained language models (PLMs) have become the mainstream approach in knowledge graph research in recent years. This paper systematically reviews research on PLM-based knowledge graphs, covering knowledge graph construction, representation learning, reasoning, and question answering. The core ideas of the relevant models and methods are introduced, a classification system is established according to their technical approaches, and the advantages and disadvantages of the different categories of methods are compared. In addition, the application of pre-trained language models to two emerging types of knowledge graphs, event knowledge graphs and multimodal knowledge graphs, is reviewed. Finally, the challenges facing current research on PLM-based knowledge graphs are summarized, and future research directions are outlined.
Key words
Knowledge graph/Pre-trained language model/Large language model/Multimodal/Event knowledge graph