Learning to Sample and Aggregate: Few-shot Reasoning over Temporal Knowledge Graphs
Authors: Ruijie Wang, Zheng Li, Dachun Sun, Shengzhong Liu, Jinning Li, Bing Yin, Tarek Abdelzaher
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirically, extensive experiments on three real-world TKGs demonstrate the superiority of MetaTKGR over state-of-the-art baselines by a large margin. |
| Researcher Affiliation | Collaboration | University of Illinois at Urbana-Champaign, IL, USA; Amazon.com Inc., CA, USA |
| Pseudocode | Yes | Algorithm 1 (Temporal Neighbor Sampler) and Algorithm 2 (MetaTKGR: Meta-training). |
| Open Source Code | No | The paper does not contain an explicit statement about releasing the source code or a link to a code repository for the methodology described. |
| Open Datasets | Yes | Datasets. We evaluate the proposed MetaTKGR framework on three public TKGs, where YAGO [34] and WIKI [22] store time-varying facts and ICEWS18 [3] is event-centric. |
| Dataset Splits | Yes | Given the temporal knowledge graph, we first split the time duration into four periods with a ratio of 0.4:0.25:0.1:0.25 chronologically; then we collect the entities that first appear in each period as the background/meta-training/meta-validation/meta-test entity sets. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models, processor types, or memory amounts used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies or library versions (e.g., 'PyTorch 1.9', 'Python 3.8') needed to replicate the experiment. |
| Experiment Setup | Yes | For fair comparison, we keep the dimension of all embeddings at 128, and utilize pre-trained 1-shot TransE embeddings for initialization where applicable. We report the detailed experimental setup, especially of MetaTKGR, in Appendix A.4. |
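The chronological entity split quoted in the Dataset Splits row can be sketched as follows. This is a hypothetical reconstruction, not the authors' released code: the function name, the quadruple format `(head, relation, tail, timestamp)`, and the linear interpolation of period boundaries are all assumptions; only the 0.4:0.25:0.1:0.25 ratio and the "assign each entity to the period of its first appearance" rule come from the paper's description.

```python
from collections import defaultdict

def chronological_entity_split(quadruples, ratios=(0.4, 0.25, 0.1, 0.25)):
    """Sketch of the split described in the paper (assumed interface).

    quadruples: iterable of (head, relation, tail, timestamp) facts.
    Returns a dict mapping split name -> set of entities that first
    appear in that chronological period.
    """
    quads = sorted(quadruples, key=lambda q: q[3])
    t_min, t_max = quads[0][3], quads[-1][3]
    span = (t_max - t_min) or 1  # avoid division issues on a single timestamp

    # Cumulative upper boundaries of the four periods over the time span.
    bounds, acc = [], 0.0
    for r in ratios:
        acc += r
        bounds.append(t_min + acc * span)

    def period(t):
        for i, b in enumerate(bounds):
            if t <= b:
                return i
        return len(bounds) - 1

    # Record the period in which each entity is first observed.
    first_seen = {}
    for h, _, t, ts in quads:
        p = period(ts)
        for e in (h, t):
            first_seen.setdefault(e, p)

    names = ("background", "meta_train", "meta_valid", "meta_test")
    split = defaultdict(set)
    for e, p in first_seen.items():
        split[names[p]].add(e)
    return dict(split)
```

Note that entities are deduplicated by first appearance, so an entity occurring in both an early and a late fact belongs only to the earlier split, which matches the quoted "entities that first appear in each period" criterion.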