Towards Better Dynamic Graph Learning: New Architecture and Unified Library
Authors: Le Yu, Leilei Sun, Bowen Du, Weifeng Lv
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | By performing exhaustive experiments on thirteen datasets for dynamic link prediction and dynamic node classification tasks, we find that DyGFormer achieves state-of-the-art performance on most of the datasets, demonstrating its effectiveness in capturing nodes' correlations and long-term temporal dependencies. |
| Researcher Affiliation | Academia | Le Yu, Leilei Sun, Bowen Du, Weifeng Lv; State Key Laboratory of Software Development Environment, School of Computer Science and Engineering, Beihang University; {yule,leileisun,dubowen,lwf}@buaa.edu.cn |
| Pseudocode | No | The paper describes its methods using prose and mathematical equations, but does not include any explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | All the used resources are publicly available at https://github.com/yule-BUAA/DyGLib. |
| Open Datasets | Yes | We experiment with thirteen datasets (Wikipedia, Reddit, MOOC, LastFM, Enron, Social Evo., UCI, Flights, Can. Parl., US Legis., UN Trade, UN Vote, and Contact), which are collected by [44] and cover diverse domains. |
| Dataset Splits | Yes | For both tasks, we chronologically split each dataset with the ratio of 70%/15%/15% for training/validation/testing. |
| Hardware Specification | Yes | Experiments are conducted on an Ubuntu machine equipped with one Intel(R) Xeon(R) Gold 6130 CPU @ 2.10GHz with 16 physical cores. The GPU device is NVIDIA Tesla T4 with 15 GB memory. |
| Software Dependencies | No | The paper mentions implementation using PyTorch ('which are all implemented by PyTorch'), but does not specify its version or other software dependencies with version numbers. |
| Experiment Setup | Yes | We set the learning rate and batch size to 0.0001 and 200 for all the methods on all the datasets. |
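The chronological 70%/15%/15% split reported above can be sketched as follows. This is a minimal illustrative helper, not DyGLib's actual API; the function name, toy timestamps, and rounding choice are assumptions.

```python
# Hypothetical sketch of a chronological 70%/15%/15% split: events are ordered
# by timestamp, then the earliest 70% go to training, the next 15% to
# validation, and the latest 15% to testing. Not DyGLib's real implementation.
def chronological_split(timestamps, val_ratio=0.15, test_ratio=0.15):
    """Return (train, val, test) lists of event indices, ordered by time."""
    order = sorted(range(len(timestamps)), key=lambda i: timestamps[i])
    n = len(order)
    n_val = round(n * val_ratio)    # round() avoids float truncation surprises
    n_test = round(n * test_ratio)
    n_train = n - n_val - n_test
    return (order[:n_train],
            order[n_train:n_train + n_val],
            order[n_train + n_val:])

# Toy usage: 20 interaction events with shuffled timestamps.
ts = [7, 3, 19, 1, 12, 5, 16, 9, 0, 14, 2, 18, 6, 11, 4, 17, 8, 13, 10, 15]
train_idx, val_idx, test_idx = chronological_split(ts)
```

Splitting by time rather than at random matters for dynamic graphs: it prevents the model from seeing "future" interactions during training, which would leak information into evaluation.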