MGNNI: Multiscale Graph Neural Networks with Implicit Layers
Authors: Juncheng Liu, Bryan Hooi, Kenji Kawaguchi, Xiaokui Xiao
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct comprehensive experiments for both node classification and graph classification to show that MGNNI outperforms representative baselines and has a better ability for multiscale modeling and capturing of long-range dependencies. |
| Researcher Affiliation | Academia | Juncheng Liu, Bryan Hooi, Kenji Kawaguchi, Xiaokui Xiao; National University of Singapore; {juncheng,bhooi,kenji,xiaoxk}@comp.nus.edu.sg |
| Pseudocode | No | The paper describes algorithms using mathematical equations and textual explanations but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The implementation can be found at https://github.com/liu-jc/MGNNI |
| Open Datasets | Yes | We use the standard train/test/val splits as in Pei et al. [23]. See the detailed setting in Appendix C.2. ... The same train/val/test splits are used as in Hamilton et al. [14]. |
| Dataset Splits | Yes | We use the standard train/test/val splits as in Pei et al. [23]. ... 10-fold cross-validation is conducted as in [31] |
| Hardware Specification | No | The main body of the paper does not specify the hardware used (e.g., GPU models, CPU types, or cloud instances). While Appendix C is mentioned for resource details in the checklist, it is not provided in the given text. |
| Software Dependencies | No | The paper mentions 'PyTorch [22]' without a specific version number and does not list other software dependencies with their versions. |
| Experiment Setup | No | The main paper refers to Appendix C for detailed experimental settings ('See the detailed setting in Appendix C.2.'), indicating that specific hyperparameters and training configurations are not included in the provided text. |
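
The paper's central component, as its title indicates, is a graph layer defined implicitly through a fixed-point equation rather than a fixed number of propagation steps. The sketch below illustrates that general idea in PyTorch (the framework cited in the table above). The class name `ImplicitGraphLayer`, the tanh update, and the hyperparameters `gamma`, `max_iter`, and `tol` are illustrative assumptions; they do not reproduce MGNNI's exact multiscale update, which should be taken from the released code at https://github.com/liu-jc/MGNNI.

```python
import torch
import torch.nn as nn

class ImplicitGraphLayer(nn.Module):
    """Generic implicit (fixed-point) graph layer: iterate
    Z <- tanh(gamma * A_hat @ Z @ W + f(X)) until (approximate) equilibrium.
    Illustrative sketch only; not the exact MGNNI update rule."""

    def __init__(self, in_dim, hid_dim, gamma=0.8, max_iter=50, tol=1e-5):
        super().__init__()
        self.input_map = nn.Linear(in_dim, hid_dim)        # f(X): feature injection term
        self.weight = nn.Linear(hid_dim, hid_dim, bias=False)
        self.gamma = gamma                                  # damping factor; < 1 helps convergence
        self.max_iter = max_iter                            # cap on fixed-point iterations
        self.tol = tol                                      # convergence tolerance

    def forward(self, x, adj_norm):
        # adj_norm: normalized adjacency matrix (dense tensor here for simplicity)
        bias = self.input_map(x)
        z = torch.zeros_like(bias)
        for _ in range(self.max_iter):
            z_new = torch.tanh(self.gamma * adj_norm @ self.weight(z) + bias)
            if torch.norm(z_new - z) < self.tol:            # stop once the iterate stabilizes
                z = z_new
                break
            z = z_new
        return z
```

A multiscale variant would combine such equilibria computed over different powers of the normalized adjacency; the sketch keeps a single scale for brevity.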
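The Dataset Splits row quotes a 10-fold cross-validation protocol for graph classification. Below is a minimal sketch of generating such folds with scikit-learn; the function name `ten_fold_splits` and the stratified, shuffled setup are assumptions, and the exact protocol of the cited reference [31] may differ.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

def ten_fold_splits(labels, seed=0):
    """Yield (train_idx, test_idx) pairs for stratified 10-fold CV over graphs.
    Illustrative only; the protocol of the cited reference [31] may differ."""
    skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=seed)
    indices = np.arange(len(labels))
    for train_idx, test_idx in skf.split(indices, labels):
        yield train_idx, test_idx

# Usage (hypothetical graph-level labels):
# labels = np.array([0, 1, 0, 1, ...])
# for fold, (tr, te) in enumerate(ten_fold_splits(labels)):
#     print(fold, len(tr), len(te))
```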