A Unified Lottery Ticket Hypothesis for Graph Neural Networks
Authors: Tianlong Chen, Yongduo Sui, Xuxi Chen, Aston Zhang, Zhangyang Wang
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our proposal has been experimentally verified across various GNN architectures and diverse tasks, on both small-scale graph datasets (Cora, Citeseer and PubMed), and large-scale datasets from the challenging Open Graph Benchmark (OGB). |
| Researcher Affiliation | Collaboration | Department of Electrical and Computer Engineering, University of Texas at Austin; University of Science and Technology of China; AWS Deep Learning. |
| Pseudocode | Yes | Algorithm 1 Unified GNN Sparsification (UGS) |
| Open Source Code | Yes | Codes are at https://github.com/VITA-Group/Unified-LTH-GNN. |
| Open Datasets | Yes | Datasets We use popular semi-supervised graph datasets: Cora, Citeseer and PubMed (Kipf & Welling, 2016), for both node classification and link prediction tasks. For experiments on large-scale graphs, we use the Open Graph Benchmark (OGB) (Hu et al., 2020), such as Ogbn-ArXiv, Ogbn-Proteins, and Ogbl-Collab. |
| Dataset Splits | Yes | Other details, such as the datasets' train-val-test splits, are included in Appendix A1. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper mentions 'NetworkX' but does not provide specific version numbers for software dependencies or frameworks used for the experiments. |
| Experiment Setup | Yes | More detailed configurations such as learning rate, training iterations, and hyperparameters in UGS, are referred to Appendix A1. |
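
To make the reported Algorithm 1 (Unified GNN Sparsification, UGS) easier to follow at a glance, below is a minimal sketch of the masking idea it relies on: a learnable mask on the graph adjacency matrix and a learnable mask on each weight matrix, both pruned by magnitude between training rounds. This is our paraphrase, not the authors' released code; the class `MaskedGCN`, the helper `prune_lowest`, the dense-adjacency formulation, and the layer sizes are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedGCN(nn.Module):
    """Two-layer GCN carrying a learnable mask on the (dense) adjacency
    matrix and on each weight matrix, in the spirit of UGS.
    The dense adjacency and the layer layout are simplifying assumptions."""

    def __init__(self, num_nodes, in_dim, hid_dim, out_dim):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim, bias=False)
        self.w2 = nn.Linear(hid_dim, out_dim, bias=False)
        # Masks start at 1.0, i.e. nothing is pruned before the first round.
        self.graph_mask = nn.Parameter(torch.ones(num_nodes, num_nodes))
        self.weight_masks = nn.ParameterList([
            nn.Parameter(torch.ones_like(self.w1.weight)),
            nn.Parameter(torch.ones_like(self.w2.weight)),
        ])

    def forward(self, adj, x):
        a = adj * self.graph_mask  # sparsified graph
        h = F.relu(a @ F.linear(x, self.w1.weight * self.weight_masks[0]))
        return a @ F.linear(h, self.w2.weight * self.weight_masks[1])


def prune_lowest(mask: torch.Tensor, fraction: float) -> None:
    """Set the `fraction` of still-active mask entries with the smallest
    magnitude to zero (one magnitude-pruning step on a single mask)."""
    with torch.no_grad():
        active = mask[mask != 0].abs()
        k = int(fraction * active.numel())
        if k == 0:
            return
        threshold = active.kthvalue(k).values
        mask[mask.abs() <= threshold] = 0.0
```

A full UGS round, as we read the paper, would additionally add sparsity (ℓ1) penalties on the graph and weight masks to the task loss during training, call a pruning step like `prune_lowest` with a small fraction for the graph mask and a larger one for the weight masks (roughly 5% and 20% per round, respectively), rewind the surviving weights to their initialization, and repeat until the target sparsity is reached. The exact learning rates, iteration counts, and UGS hyperparameters are deferred to Appendix A1 of the paper, as noted in the table above.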