Towards Scale-Invariant Graph-related Problem Solving by Iterative Homogeneous GNNs
Authors: Hao Tang, Zhiao Huang, Jiayuan Gu, Bao-Liang Lu, Hao Su
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimentally, we show that our GNN can be trained from small-scale graphs but generalize well to large-scale graphs for a number of basic graph theory problems. It also shows generalizability for applications of multi-body physical simulation and image-based navigation problems. |
| Researcher Affiliation | Academia | Hao Tang (Shanghai Jiao Tong University, tanghaosjtu@gmail.com); Zhiao Huang (UC San Diego, z2huang@eng.ucsd.edu); Jiayuan Gu (UC San Diego, jigu@eng.ucsd.edu); Bao-Liang Lu (Shanghai Jiao Tong University, bllu@sjtu.edu.cn); Hao Su (UC San Diego, haosu@eng.ucsd.edu) |
| Pseudocode | Yes | We present the pseudo-codes in Algorithm 1. |
| Open Source Code | No | The paper does not provide any explicit statements about releasing source code or links to a code repository. |
| Open Datasets | No | The paper states that it built a benchmark by combining multiple graph generators, noting that 'The generation processes and the properties of datasets are listed in the Appendix.' However, it provides no concrete access information (e.g., a URL, DOI, or a specific, named public dataset with citation) for these generated datasets. |
| Dataset Splits | Yes | We generate 10000 samples for training, 1000 samples for validation, and 1000 samples for testing. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware used for running experiments, such as GPU/CPU models or memory specifications. |
| Software Dependencies | No | The paper does not provide specific version numbers for any software dependencies or libraries used in the experiments. |
| Experiment Setup | Yes | We utilize the default hyper-parameters to train models. We generate 10000 samples for training, 1000 samples for validation, and 1000 samples for testing. The only two tunable hyper-parameters in our experiment are the epoch number (10 choices) and the formulation of Path GNN layers (3 choices). Validation datasets are used to tune them. |
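For readers reproducing the reported 10000/1000/1000 train/validation/test partition, a minimal sketch is shown below. The paper's graph generators and random seeds are not released, so the `samples` list, the `split_samples` helper, and the seed are stand-ins, not the authors' actual pipeline.

```python
# Hypothetical helper illustrating the 10000/1000/1000 split quoted above;
# `samples` here is a placeholder for the paper's generated graph instances.
import random

def split_samples(samples, n_train=10000, n_val=1000, n_test=1000, seed=0):
    """Shuffle and partition samples into disjoint train/val/test sets."""
    assert len(samples) >= n_train + n_val + n_test
    rng = random.Random(seed)       # fixed seed for a reproducible shuffle
    shuffled = list(samples)
    rng.shuffle(shuffled)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:n_train + n_val + n_test]
    return train, val, test

train, val, test = split_samples(range(12000))
print(len(train), len(val), len(test))  # 10000 1000 1000
```

Because the splits are disjoint slices of one shuffled list, no sample appears in more than one partition.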