Descent Steps of a Relation-Aware Energy Produce Heterogeneous Graph Neural Networks
Authors: Hongjoon Ahn, Yongyi Yang, Quan Gan, Taesup Moon, David P Wipf
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results on 8 heterogeneous graph benchmarks demonstrate that our proposed method can achieve competitive node classification accuracy. |
| Researcher Affiliation | Collaboration | Hongjoon Ahn 1, Yongyi Yang 2, Quan Gan 3, Taesup Moon 1 and David Wipf 3; 1 ECE/IPAI/ASRI/INMC, Seoul National University, 2 University of Michigan, 3 Amazon Web Services |
| Pseudocode | Yes | Algorithm 1 HALO algorithm |
| Open Source Code | Yes | The source code of our algorithm is available at https://github.com/hongjoon0805/HALO. |
| Open Datasets | Yes | HGB contains 4 node classification datasets, which is our focus herein. These include: DBLP, IMDB, ACM, and Freebase. The knowledge graph benchmarking proposed in [29] is composed of 4 datasets: AIFB, MUTAG, BGS, AM. |
| Dataset Splits | No | For the details on the hyperparameters and other experimental settings, please see the Supplementary Materials. |
| Hardware Specification | No | The paper's checklist states: "Did you include the total amount of compute and the type of resources used (e.g., type of GPUs, internal cluster, or cloud provider)? [Yes] Please see Supplementary Materials." However, this information is not provided in the main paper. |
| Software Dependencies | No | All models and experiments were implemented using PyTorch [25] and the Deep Graph Library (DGL) [32]. |
| Experiment Setup | No | For the details on the hyperparameters and other experimental settings, please see the Supplementary Materials. |