Narrative-Driven Large Language Model for Temporal Knowledge Graph Prediction
Temporal knowledge graphs (TKGs) are characterized by vast sparsity, and the long-tail distribution of entities leads to poor generalization when reasoning over out-of-distribution entities. In addition, the low frequency of historical interactions results in biased predictions of future events. To address these issues, a narrative-driven large language model for TKG prediction is proposed. The world knowledge and complex semantic reasoning capabilities of large language models are leveraged to improve the understanding of out-of-distribution entities and the association of sparse interaction events. First, a key event tree is constructed from the temporal and structural characteristics of the TKG, and the most representative events are extracted through a historical event filtering strategy; relevant historical information is summarized so that the input is reduced while the most important information is retained. Then, a large language model generator is fine-tuned to produce logically coherent "key event tree" narratives as unstructured input. During generation, particular attention is paid to the causal relationships and temporal ordering of events to ensure the coherence and plausibility of the generated stories. Finally, a large language model is used as a reasoner to infer the missing temporal entities. Experiments on three public datasets demonstrate that the proposed method effectively leverages the capabilities of large models to achieve more accurate temporal entity reasoning.
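The pipeline described above can be illustrated with a minimal sketch. All names here (`Event`, `filter_key_events`, `build_narrative_prompt`, the recency-plus-frequency scoring) are hypothetical choices for illustration, not the paper's actual implementation: representative historical events are filtered for a query entity, ordered temporally, and serialized into an unstructured narrative prompt that an LLM reasoner would complete.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass(frozen=True)
class Event:
    """A TKG fact: (subject, relation, object, timestamp)."""
    subject: str
    relation: str
    obj: str
    timestamp: int


def filter_key_events(history, query_subject, top_k=3):
    """Hypothetical historical event filtering: keep the top_k events
    involving the query subject, preferring recent events and breaking
    ties by tail-entity frequency (a proxy for representativeness)."""
    related = [e for e in history
               if e.subject == query_subject or e.obj == query_subject]
    freq = Counter(e.obj for e in related)
    scored = sorted(related,
                    key=lambda e: (e.timestamp, freq[e.obj]),
                    reverse=True)
    return scored[:top_k]


def build_narrative_prompt(key_events, query):
    """Serialize filtered events in temporal order as an unstructured
    'story' prompt; the missing entity is left as a [MASK] slot for
    the LLM reasoner to fill (sketch only, no model call)."""
    lines = [f"At t={e.timestamp}, {e.subject} {e.relation} {e.obj}."
             for e in sorted(key_events, key=lambda e: e.timestamp)]
    subject, relation, timestamp = query
    lines.append(f"At t={timestamp}, {subject} {relation} [MASK]. "
                 f"Which entity fills [MASK]?")
    return "\n".join(lines)


# Toy usage with fabricated events, assuming integer timestamps.
history = [
    Event("Germany", "negotiate_with", "France", 1),
    Event("Germany", "impose_sanctions_on", "Russia", 2),
    Event("France", "visit", "Germany", 3),
    Event("Germany", "sign_treaty_with", "Italy", 4),
    Event("Spain", "trade_with", "Portugal", 4),
]
key_events = filter_key_events(history, "Germany", top_k=3)
prompt = build_narrative_prompt(key_events, ("Germany", "meet_with", 5))
print(prompt)
```

A real implementation would replace the final `print` with a call to a fine-tuned generator and reasoner; this sketch only shows how filtering and narrative serialization fit together.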
Temporal Knowledge Graph (TKG); Large Language Model; Key Event Tree; Temporal Story; Event Inference