Large Language Models-guided Dynamic Adaptation for Temporal Knowledge Graph Reasoning
Authors: Jiapu Wang, Kai Sun, Linhao Luo, Wei Wei, Yongli Hu, Alan Wee-Chung Liew, Shirui Pan, Baocai Yin
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results show that without the need of fine-tuning, LLM-DA significantly improves the accuracy of reasoning over several common datasets, providing a robust framework for TKGR tasks. |
| Researcher Affiliation | Academia | Jiapu Wang¹, Kai Sun¹, Linhao Luo², Wei Wei³, Yongli Hu¹, Alan Wee-Chung Liew⁴, Shirui Pan⁴, Baocai Yin¹ (¹Beijing University of Technology, China; ²Monash University, Australia; ³University of Hong Kong, China; ⁴Griffith University, Australia) |
| Pseudocode | No | The paper describes methods in text and uses flow diagrams but does not contain a formal "Pseudocode" or "Algorithm" block. |
| Open Source Code | Yes | Code and data are available at: https://github.com/jiapuwang/LLM-DA.git |
| Open Datasets | Yes | Datasets. ICEWS14 [52] and ICEWS05-15 [52] are subsets of the Integrated Crisis Early Warning System (ICEWS)... |
| Dataset Splits | Yes | Specifically, the historical data, current data and future data correspond to the training, validation, and test datasets of prior research [46, 22]. |
| Hardware Specification | Yes | All experiments are implemented on an NVIDIA RTX 3090 GPU with an i9-10900X CPU. |
| Software Dependencies | No | The paper mentions specific LLM models (ChatGPT-4, Llama-2-7b-CoH, Vicuna-7b-CoH, Mixtral-8x7B-CoH, GPT-NeoX) and pre-trained models (Sentence-BERT), but does not provide specific version numbers for these or other software libraries/dependencies. |
| Experiment Setup | Yes | Additionally, LLM-DA sets the decay rate λ in Temporal Logical Rules Sampling and Candidate Generation, the threshold θ in Dynamic Adaptation, the min-confidence γ and the parameter α in Candidate Generation on both datasets as follows: λ = 0.1, θ = 0.01, α = 0.9 and γ = 0.01, except for α = 0.8 on ICEWS05-15. The number of iterations for the Dynamic Adaptation is set as 5. |
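
The reported hyperparameters can be gathered into a small configuration sketch. The snippet below is a minimal illustration only: the field and function names are hypothetical, and the exponential time-decay weighting is an assumed way the decay rate λ could enter temporal rule sampling and candidate scoring, not a formula confirmed by the quoted text.

```python
import math

# Hypothetical configuration mirroring the hyperparameters reported above.
# Field names are illustrative and are not taken from the released code.
CONFIG = {
    "ICEWS14": {
        "decay_rate": 0.1,       # lambda
        "threshold": 0.01,       # theta, Dynamic Adaptation
        "alpha": 0.9,            # Candidate Generation weight
        "min_confidence": 0.01,  # gamma
        "adaptation_iters": 5,
    },
    "ICEWS05-15": {
        "decay_rate": 0.1,
        "threshold": 0.01,
        "alpha": 0.8,            # only alpha differs on ICEWS05-15
        "min_confidence": 0.01,
        "adaptation_iters": 5,
    },
}


def time_decay_weight(query_time: int, fact_time: int, decay_rate: float) -> float:
    """Exponential decay giving more weight to facts closer to the query time.

    Assumed form of how a decay rate might be applied in temporal rule
    sampling / candidate generation; the paper's exact formula may differ.
    """
    return math.exp(-decay_rate * (query_time - fact_time))


if __name__ == "__main__":
    cfg = CONFIG["ICEWS14"]
    # A fact 10 time steps before the query contributes with weight ~0.37.
    print(time_decay_weight(query_time=100, fact_time=90,
                            decay_rate=cfg["decay_rate"]))
```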