Automorphic Equivalence-aware Graph Neural Network
Authors: Fengli Xu, Quanming Yao, Pan Hui, Yong Li
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Finally, we empirically validate our model on eight real-world graph datasets, including social networks, an e-commerce co-purchase network, and citation networks, and show that it consistently outperforms existing GNNs. |
| Researcher Affiliation | Collaboration | ¹BNRIST & EE Department, Tsinghua University; ²4Paradigm Inc.; ³CSE Department, HKUST. |
| Pseudocode | Yes | Algorithm 1: Genetic Algorithm for Subgraph Template Optimization (see the illustrative sketch after this table). |
| Open Source Code | Yes | The source code is publicly available at https://github.com/tsinghua-fib-lab/GRAPE. |
| Open Datasets | Yes | Citation networks [44]: We consider 2 widely used citation networks, i.e., Cora and Citeseer. Social networks [47]: We use 5 real-world social networks collected from the Facebook friendships within 5 universities, i.e., Hamilton, Lehigh, Rochester, Johns Hopkins (JHU), and Amherst. E-commerce co-purchase networks [26]: This dataset was collected by crawling music items on the Amazon website. |
| Dataset Splits | No | The paper states, "The hyper-parameter tuning and detailed experiment settings are discussed in Appendix D.1." However, Appendix D.1 is not included in the provided text, so specific train/validation/test split information is unavailable. |
| Hardware Specification | No | The paper mentions "CPU" and "GPU" in the context of training limitations for some baselines, but it does not provide specific hardware models (e.g., NVIDIA A100, Intel i7) or detailed specifications used for running its experiments. |
| Software Dependencies | No | The paper mentions "Pytorch platform" but does not specify a version number or other key software components with their versions. |
| Experiment Setup | No | The paper states, "The hyper-parameter tuning and detailed experiment settings are discussed in Appendix D.1." However, Appendix D.1 is not included in the provided text, so specific hyperparameters, training configurations, and system-level settings are unavailable. |
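Since the full text of Algorithm 1 is not reproduced here, the following is a minimal, generic sketch of a genetic algorithm of the kind the paper's "Genetic Algorithm for Subgraph Template Optimization" describes. Every detail below is an assumption for illustration: the binary-mask encoding of a subgraph template, the population size and mutation rate, and the stand-in fitness function (which in the paper's setting would be a GNN validation score for the candidate template, not a bit count).

```python
import random

# Hypothetical encoding (assumption): a subgraph template is a fixed-length
# binary mask selecting which candidate structural patterns to include.
TEMPLATE_LEN = 16
POP_SIZE = 20
GENERATIONS = 50
MUTATION_RATE = 0.05
TOURNAMENT_K = 3


def fitness(template):
    """Stand-in objective. In the paper's setting this would be the
    validation performance of the GNN built from the subgraph template."""
    return sum(template)


def tournament_select(population):
    # Pick the fittest of K randomly chosen candidates.
    contenders = random.sample(population, TOURNAMENT_K)
    return max(contenders, key=fitness)


def crossover(a, b):
    # Single-point crossover between two parent templates.
    point = random.randrange(1, TEMPLATE_LEN)
    return a[:point] + b[point:]


def mutate(template):
    # Flip each bit independently with a small probability.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in template]


def evolve():
    population = [[random.randint(0, 1) for _ in range(TEMPLATE_LEN)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population = [mutate(crossover(tournament_select(population),
                                       tournament_select(population)))
                      for _ in range(POP_SIZE)]
    return max(population, key=fitness)


if __name__ == "__main__":
    best = evolve()
    print("best template:", best, "fitness:", fitness(best))
```

This is a sketch of the general technique only; the paper's actual operators, encoding, and stopping criteria are specified in its Algorithm 1 and appendices, which are not part of the provided content.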