Edge Representation Learning with Hypergraphs
Authors: Jaehyeong Jo, Jinheon Baek, Seul Lee, Dongki Kim, Minki Kang, Sung Ju Hwang
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We validate our edge representation learning method with hypergraphs on diverse graph datasets for graph representation and generation performance, on which our method largely outperforms existing graph representation learning methods. |
| Researcher Affiliation | Collaboration | Jaehyeong Jo1 , Jinheon Baek1 , Seul Lee1 , Dongki Kim1, Minki Kang1, Sung Ju Hwang1,2 KAIST1, AITRICS2, South Korea |
| Pseudocode | No | The paper describes methods through textual descriptions and mathematical equations but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code is available at https://github.com/harryjo97/EHGNN |
| Open Datasets | Yes | For edge and graph reconstruction of real-world graphs, we use the ZINC dataset [19] that consists of 12K molecular graphs [7]... We use 6 datasets from the TU datasets [28]... Also, we further use the 4 molecule datasets (i.e., HIV, Tox21, ToxCast, BBBP) from the recently released OGB datasets [17]. |
| Dataset Splits | Yes | We evaluate the accuracy of each model with 10-fold cross validation [43] on the TU datasets, and use ROC-AUC as the evaluation metric for the OGB datasets. For both datasets, we follow the standard experimental settings, from the feature extraction to the dataset splitting. |
| Hardware Specification | No | The paper does not explicitly provide details about the specific hardware (e.g., CPU, GPU models, memory) used for running the experiments. It refers to 'deep GNNs' in general terms but gives no concrete hardware specifications. |
| Software Dependencies | No | The paper mentions using various GNN models and frameworks (e.g., GCN, GIN, GMPool, MolGAN), but it does not specify exact version numbers for any of the software dependencies, libraries, or programming languages used. |
| Experiment Setup | Yes | For node reconstruction, we set message-passing to GCN and graph pooling to GMPool [2] for all models. We compare the proposed EHGNN framework against edge-aware GNNs, namely EGCN [17], MPNN [12], R-GCN [32], and EGNN [13]... Our HyperDrop uses SAGPool [24] on the hypergraph, which is a node drop pooling method based on self-attention. |
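The 10-fold cross-validation protocol quoted in the Dataset Splits row can be sketched in plain Python. The `k_fold_splits` helper below is purely illustrative (it is not the authors' code, and the fold count, seed, and dataset size are assumptions); dataset loading, model training, and accuracy/ROC-AUC computation would happen elsewhere.

```python
import random

def k_fold_splits(n_samples, k=10, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold cross validation.

    Indices are shuffled once with a fixed seed so that every graph
    appears in exactly one test fold across the k iterations.
    """
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    fold_size = n_samples // k
    for i in range(k):
        start = i * fold_size
        # The last fold absorbs any remainder when n_samples % k != 0.
        end = start + fold_size if i < k - 1 else n_samples
        test = idx[start:end]
        train = idx[:start] + idx[end:]
        yield train, test

# Illustrative usage with a hypothetical dataset of 120 graphs:
folds = list(k_fold_splits(120, k=10))
assert len(folds) == 10
assert all(len(tr) + len(te) == 120 for tr, te in folds)
```

In a full evaluation loop, each fold's training indices would feed model fitting and the held-out indices would produce one accuracy (TU) or ROC-AUC (OGB) score, with the mean over the 10 folds reported.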