Finding Global Homophily in Graph Neural Networks When Meeting Heterophily
Authors: Xiang Li, Renyu Zhu, Yao Cheng, Caihua Shan, Siqiang Luo, Dongsheng Li, Weining Qian
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct extensive experiments to compare our models against 11 other competitors on 15 benchmark datasets in a wide range of domains, scales and graph heterophilies. Experimental results show that our methods achieve superior performance and are also very efficient. |
| Researcher Affiliation | Collaboration | 1School of Data Science and Engineering, East China Normal University, Shanghai, China 2Microsoft Research Asia, Shanghai, China 3School of Computer Science and Engineering, Nanyang Technological University, Singapore. |
| Pseudocode | Yes | Finally, we summarize the pseudocodes of GloGNN in Algorithm 1 (Section A of the appendix). |
| Open Source Code | Yes | We provide our code and data at https://github.com/RecklessRonan/GloGNN. |
| Open Datasets | Yes | For fairness, we conduct experiments on 15 benchmark datasets, which include 9 small-scale datasets released by (Pei et al., 2020) and 6 large-scale datasets from (Lim et al., 2021). We use the same training/validation/test splits as provided by the original papers. |
| Dataset Splits | Yes | We use the same training/validation/test splits as provided by the original papers. |
| Hardware Specification | Yes | Meanwhile, we run the experiments of 6 large-scale datasets on a single Tesla V100 GPU with 32G memory and use AdamW as the optimizer following (Lim et al., 2021). |
| Software Dependencies | No | The paper mentions "We implemented GloGNN by PyTorch." but does not provide a specific version number for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | We perform a grid search to tune hyper-parameters based on the results on the validation set. Details of these hyper-parameters are listed in Tables 3 and 4. |
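For context on the Experiment Setup row, the paper tunes hyper-parameters by grid search using validation-set results, with the concrete grids listed in Tables 3 and 4 of the paper. The sketch below is a minimal, hypothetical illustration of that kind of validation-based grid search; the hyper-parameter names and the `train_and_evaluate` helper are assumptions for illustration and are not taken from the GloGNN code.

```python
from itertools import product

# Hypothetical search space for illustration only; the actual grids used by the
# authors are listed in Tables 3 and 4 of the paper.
grid = {
    "lr": [0.001, 0.005, 0.01],
    "weight_decay": [0.0, 5e-4, 1e-3],
    "dropout": [0.0, 0.5, 0.8],
}

def train_and_evaluate(lr, weight_decay, dropout):
    """Placeholder for a full training run that returns validation accuracy.

    In a real reproduction this would train the model on the fixed training
    split and report accuracy on the fixed validation split.
    """
    return 0.0  # stand-in value; replace with an actual training loop

best_val_acc, best_config = float("-inf"), None
for lr, wd, dp in product(grid["lr"], grid["weight_decay"], grid["dropout"]):
    val_acc = train_and_evaluate(lr, wd, dp)
    if val_acc > best_val_acc:
        best_val_acc = val_acc
        best_config = {"lr": lr, "weight_decay": wd, "dropout": dp}

print("Best validation accuracy:", best_val_acc)
print("Selected hyper-parameters:", best_config)
```

The configuration with the highest validation accuracy would then be used for the final test-set evaluation, matching the selection protocol the paper describes.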