From Hypergraph Energy Functions to Hypergraph Neural Networks

Authors: Yuxin Wang, Quan Gan, Xipeng Qiu, Xuanjing Huang, David Wipf

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Empirically, we demonstrate state-of-the-art results on various hypergraph node classification benchmarks.
Researcher Affiliation | Collaboration | Yuxin Wang 1,2; Quan Gan 3; Xipeng Qiu 1,4; Xuanjing Huang 1,5; David Wipf 3. Work completed during an internship at the AWS Shanghai AI Lab. 1 School of Computer Science, Fudan University; 2 Institute of Modern Languages and Linguistics, Fudan University; 3 Amazon; 4 Peng Cheng Laboratory; 5 Shanghai Collaborative Innovation Center of Intelligent Visual Computing.
Pseudocode | Yes | The overall algorithm for PhenomNN is demonstrated in Algorithm 1.
Open Source Code | Yes | Code is available at https://github.com/yxzwang/PhenomNN.
Open Datasets | Yes | We adopt five public citation network datasets from (Zhang et al., 2022): Co-authorship/Cora, Co-authorship/DBLP, Co-citation/Cora, Co-citation/Pubmed, Co-citation/Citeseer. These datasets and splits are constructed by (Yadati et al., 2019) (https://github.com/malllabiisc/HyperGCN). We also adopt two other public visual object classification datasets: Princeton ModelNet40 (Wu et al., 2015) and the National Taiwan University (NTU) 3D model dataset (Chen et al., 2003). [...] All datasets from (Chien et al., 2022) are downloaded from their code site (https://github.com/jianhao2016/AllSet).
Dataset Splits | Yes | For results in Table 2, we randomly split the data into training/validation/test samples using (50%/25%/25%) splitting percentages as in (Chien et al., 2022) and report the average accuracy over ten random splits. (A minimal sketch of this split protocol follows the table.)
Hardware Specification | Yes | All experiments are implemented on RTX 3090 with Pytorch and DGL (Wang et al., 2019a).
Software Dependencies | No | The paper mentions 'Pytorch and DGL (Wang et al., 2019a)' but does not specify their version numbers.
Experiment Setup | Yes | Detailed hyperparameter settings are deferred to Appendix D. Here we present hyperparameters for reproducing the results of Tables 1 and 2 in Tables 7 and 8, and for Table 5 the hyperparameters are in Tables 9 and 10. Note that in the ablation for combination coefficients, we re-searched for hyperparameters for each combination.
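For concreteness, below is a minimal sketch of the split-and-average protocol quoted in the Dataset Splits row: random 50%/25%/25% train/validation/test partitions, with test accuracy averaged over ten random splits as in (Chien et al., 2022). The function names (random_split, train_eval) and the use of NumPy are illustrative assumptions, not the authors' actual code.

```python
import numpy as np

def random_split(num_nodes, seed, train_frac=0.5, val_frac=0.25):
    """Randomly partition node indices into 50%/25%/25% train/val/test sets."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(num_nodes)
    n_train = int(train_frac * num_nodes)
    n_val = int(val_frac * num_nodes)
    return (perm[:n_train],                 # training indices (50%)
            perm[n_train:n_train + n_val],  # validation indices (25%)
            perm[n_train + n_val:])         # test indices (25%)

def average_accuracy(num_nodes, train_eval, num_splits=10):
    """Repeat training over `num_splits` random splits and average accuracy.

    `train_eval` is a hypothetical stand-in for the actual training loop:
    it takes (train_idx, val_idx, test_idx) and returns test accuracy.
    """
    accs = []
    for seed in range(num_splits):
        train_idx, val_idx, test_idx = random_split(num_nodes, seed)
        accs.append(train_eval(train_idx, val_idx, test_idx))
    return float(np.mean(accs)), float(np.std(accs))
```

Using the split index as the random seed keeps the ten partitions reproducible across runs while still varying the partition from split to split.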