Dynamic Graph Neural Networks Under Spatio-Temporal Distribution Shift
Authors: Zeyang Zhang, Xin Wang, Ziwei Zhang, Haoyang Li, Zhou Qin, Wenwu Zhu
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on three real-world datasets and one synthetic dataset demonstrate the superiority of our method over state-of-the-art baselines under distribution shifts. |
| Researcher Affiliation | Collaboration | Tsinghua University; Alibaba Group |
| Pseudocode | Yes | Algorithm 1 Training pipeline for DIDA |
| Open Source Code | Yes | Our codes are publicly available at https://github.com/wondergo2017/DIDA |
| Open Datasets | Yes | COLLAB [51] is an academic collaboration dataset... https://www.aminer.cn/collaboration. Yelp [43] is a business review dataset... https://www.yelp.com/dataset |
| Dataset Splits | Yes | To measure model performance under spatio-temporal distribution shift, we choose one field as w/ DS, and the remaining fields are further split into training, validation and test data (w/o DS) chronologically. |
| Hardware Specification | No | The paper discusses computational complexity but does not provide specific details on the hardware used for experiments, such as GPU or CPU models. |
| Software Dependencies | No | The paper does not provide specific version numbers for software dependencies or libraries used in the implementation. |
| Experiment Setup | No | The paper states 'More Details of the settings and other results can be found in Appendix' but does not include specific hyperparameters such as learning rate, batch size, or optimizer settings within the main text. |
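The split protocol quoted above (hold out one field entirely as the distribution-shifted test set, then split the remaining data chronologically) can be sketched as follows. This is a minimal illustration of the described procedure, not the authors' code; the function name, record layout, and split ratios are assumptions.

```python
# Hypothetical sketch of the split described in the paper: one field is held
# out entirely as the "w/ DS" (with distribution shift) test set, while the
# remaining records are split chronologically into train/val/test ("w/o DS").
# Record layout (timestamp, field, edge) and the 70/15/15 ratios are
# illustrative assumptions.

def split_dynamic_graph(records, shift_field, train_ratio=0.7, val_ratio=0.15):
    """records: iterable of (timestamp, field, edge) tuples.
    shift_field: the field held out wholesale as the w/ DS test set."""
    # Everything from the held-out field goes to the distribution-shift split.
    w_ds = [r for r in records if r[1] == shift_field]
    # The rest is ordered by time and cut chronologically.
    rest = sorted((r for r in records if r[1] != shift_field),
                  key=lambda r: r[0])
    n = len(rest)
    n_train = int(n * train_ratio)
    n_val = int(n * val_ratio)
    train = rest[:n_train]
    val = rest[n_train:n_train + n_val]
    test_wo_ds = rest[n_train + n_val:]
    return train, val, test_wo_ds, w_ds
```

Because the w/o DS test slice is the chronologically latest portion of the kept fields, it probes temporal generalization, while the held-out field probes the spatio-temporal shift the paper targets.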