FreeDyG: Frequency Enhanced Continuous-Time Dynamic Graph Model for Link Prediction
Authors: Yuxing Tian, Yiyan Qi, Fan Guo
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments conducted on seven real-world continuous-time dynamic graph datasets validate the effectiveness of FreeDyG. The results consistently demonstrate that FreeDyG outperforms existing methods in both transductive and inductive settings. |
| Researcher Affiliation | Collaboration | Yuxing Tian¹, Yiyan Qi¹, Fan Guo² (¹IDEA Research, International Digital Economy Academy; ²Jiangxi Normal University) |
| Pseudocode | Yes | In Algorithm 1, we show the pseudo-code of the training process of FreeDyG. In addition, following the suggestion of the reviewers, we briefly describe the procedure of FFT in Algorithm 2. |
| Open Source Code | Yes | Our code is available at this repository: https://github.com/Tianxzzz/FreeDyG |
| Open Datasets | Yes | We utilize seven publicly available real-world datasets: Wiki, REDDIT, MOOC, LastFM, Enron, Social Evo, and UCI, in our study. |
| Dataset Splits | Yes | To facilitate training, validation, and testing, we split these datasets into three chronological segments with ratios of 70%-15%-15%. |
| Hardware Specification | Yes | All experiments are performed on an NVIDIA A100-SXM4 40GB GPU. |
| Software Dependencies | No | The paper mentions "Adam optimizer" but does not provide specific version numbers for software dependencies such as Python, PyTorch, or other libraries. |
| Experiment Setup | Yes | All models are trained for a maximum of 200 epochs using the early stopping strategy with patience of 20. The model that achieves the highest performance on the validation set is selected for testing. For all models, we employ the Adam optimizer and set the learning rate and batch size to 0.0001 and 200, respectively. The hyperparameter configurations of the baselines align with those specified in their respective papers. For our FreeDyG, we set d_T to 100, and both α and β to 10. The number of frequency-enhanced MLP-Mixer layers is 2. We conduct ten runs of each method with different seeds and report the average performance to eliminate deviations. |
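
The Pseudocode row above notes that the paper outlines the FFT procedure in Algorithm 2. As a rough, non-authoritative illustration of the kind of learnable frequency-domain filtering a frequency-enhanced layer typically performs (this is not the authors' Algorithm 2; the module name, tensor shapes, and filter parameterization are assumptions), a minimal PyTorch sketch:

```python
import torch
import torch.nn as nn

class FrequencyFilter(nn.Module):
    """Minimal sketch of a learnable frequency-domain filter.

    NOT the authors' Algorithm 2: sequence length, feature dimension,
    and the complex filter parameterization are assumptions.
    """
    def __init__(self, seq_len: int, dim: int):
        super().__init__()
        # One complex weight per retained frequency bin and feature channel.
        self.filter = nn.Parameter(
            torch.randn(seq_len // 2 + 1, dim, dtype=torch.cfloat) * 0.02
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) real-valued neighbor-interaction features.
        x_freq = torch.fft.rfft(x, dim=1)                    # to frequency domain
        x_freq = x_freq * self.filter                        # element-wise filtering
        return torch.fft.irfft(x_freq, n=x.size(1), dim=1)   # back to time domain

# Example with illustrative dimensions (not taken from the paper).
x = torch.randn(4, 200, 172)
print(FrequencyFilter(200, 172)(x).shape)  # torch.Size([4, 200, 172])
```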
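
The Dataset Splits row reports a chronological 70%-15%-15% split. A generic sketch of such a split over timestamped edges (the function name and quantile-based boundaries are assumptions, not the released data loader):

```python
import numpy as np

def chronological_split(timestamps: np.ndarray, val_ratio=0.15, test_ratio=0.15):
    """Split edge indices chronologically into train/val/test masks.

    Boundaries are taken at the 70th and 85th percentiles of the edge
    timestamps, matching a 70/15/15 split.
    """
    val_time, test_time = np.quantile(
        timestamps, [1 - val_ratio - test_ratio, 1 - test_ratio]
    )
    train_mask = timestamps <= val_time
    val_mask = (timestamps > val_time) & (timestamps <= test_time)
    test_mask = timestamps > test_time
    return train_mask, val_mask, test_mask

# Example with synthetic timestamps.
ts = np.sort(np.random.rand(1000))
train_m, val_m, test_m = chronological_split(ts)
print(train_m.sum(), val_m.sum(), test_m.sum())  # roughly 700 / 150 / 150
```

Splitting by time rather than at random keeps every test interaction strictly after the training interactions, which is the standard protocol for dynamic link prediction.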
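
The Experiment Setup row lists the optimizer, learning rate, batch size, epoch budget, and early-stopping patience. A hedged skeleton of a training loop consistent with those settings (the model, loaders, and evaluation function are hypothetical placeholders, not the released code):

```python
import copy
import torch

def train(model, train_loader, val_loader, evaluate,
          lr=1e-4, max_epochs=200, patience=20):
    """Skeleton matching the reported setup: Adam, lr 1e-4, up to 200 epochs,
    early stopping with patience 20 on validation performance.
    `model`, the loaders, and `evaluate` are hypothetical placeholders."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    best_val, best_state, bad_epochs = float("-inf"), None, 0
    for epoch in range(max_epochs):
        model.train()
        for batch in train_loader:               # batch size 200 in the paper
            optimizer.zero_grad()
            loss = model(batch)                   # placeholder: model returns its loss
            loss.backward()
            optimizer.step()
        val_score = evaluate(model, val_loader)   # e.g. Average Precision
        if val_score > best_val:
            best_val, bad_epochs = val_score, 0
            best_state = copy.deepcopy(model.state_dict())
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break
    model.load_state_dict(best_state)             # best checkpoint used for testing
    return model
```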