Unleashing the Power of Graph Data Augmentation on Covariate Distribution Shift
Authors: Yongduo Sui, Qitian Wu, Jiancan Wu, Qing Cui, Longfei Li, Jun Zhou, Xiang Wang, Xiangnan He
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments with in-depth empirical analysis demonstrate the superiority of our approach. The implementation codes are publicly available at https://github.com/yongduosui/AIA. In this section, we conduct extensive experiments to answer the following Research Questions: ... We first make comparisons with various baselines in Table 1, and have the following observations: ... We also conduct extensive experiments and in-depth analyses. |
| Researcher Affiliation | Collaboration | 1University of Science and Technology of China, 2Shanghai Jiao Tong University, 3Ant Group |
| Pseudocode | Yes | Algorithm 1: Estimation of Graph Covariate shift; Algorithm 2: Adversarial Invariant Augmentation |
| Open Source Code | Yes | The implementation codes are publicly available at https://github.com/yongduosui/AIA. |
| Open Datasets | Yes | We use graph OOD datasets [2] and OGB datasets [20], which include Motif, CMNIST, Molbbbp, and Molhiv. |
| Dataset Splits | Yes | For size covariate shift, we use small-size graphs for training, while the validation and the test sets include the middle- and the large-size graphs, respectively. ... For scaffold shift, we follow [2] and use scaffold split to create training, validation and test sets. |
| Hardware Specification | Yes | We use the NVIDIA GeForce RTX 3090 (24GB GPU) to conduct all our experiments. |
| Software Dependencies | No | The paper mentions using GIN as the backbone and discusses some general software aspects but does not provide specific version numbers for software dependencies (e.g., Python, PyTorch, or other libraries). |
| Experiment Setup | Yes | We tune the hyper-parameters in the following ranges: α and β ∈ {0.01, 0.005, 0.001}; λ2 ∈ {0.1, ..., 0.9}; γ ∈ {0.01, 0.1, 0.2, 0.5, 1.0, 1.5, 2.0, 3.0, 5.0}. The hyper-parameters are summarized in Table 5. |
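As a minimal sketch of what the reported search space implies, the quoted ranges can be enumerated as a grid. The ranges below are taken directly from the paper's quoted setup; the grid construction itself (variable names, dict layout) is illustrative and not the authors' actual tuning script.

```python
from itertools import product

# Search ranges as quoted in the paper's experiment setup (summarized in its
# Table 5). Assumption: α and β are tuned independently over the same set.
alphas = [0.01, 0.005, 0.001]
betas = [0.01, 0.005, 0.001]
lambda2s = [round(0.1 * i, 1) for i in range(1, 10)]   # 0.1, 0.2, ..., 0.9
gammas = [0.01, 0.1, 0.2, 0.5, 1.0, 1.5, 2.0, 3.0, 5.0]

# Full Cartesian product of the quoted ranges.
grid = [
    {"alpha": a, "beta": b, "lambda2": l2, "gamma": g}
    for a, b, l2, g in product(alphas, betas, lambda2s, gammas)
]
print(len(grid))  # total configurations in an exhaustive sweep
```

Under the independent-tuning assumption, an exhaustive sweep covers 3 × 3 × 9 × 9 = 729 configurations, which is useful to know when budgeting a reproduction run.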