GRAFENNE: Learning on Graphs with Heterogeneous and Dynamic Feature Sets
Authors: Shubham Gupta, Sahil Manchanda, Sayan Ranu, Srikanta J. Bedathur
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirical evaluation: Extensive experiments on diverse real-world datasets establish that GRAFENNE consistently outperforms baseline methods across various levels of feature scarcity on both homophilic and heterophilic graphs. |
| Researcher Affiliation | Academia | Department of Computer Science and Engineering, IIT Delhi. |
| Pseudocode | No | The paper describes the message passing layers using mathematical equations and text, but does not include a formal pseudocode block or algorithm listing. |
| Open Source Code | Yes | Our codebase is available at https://github.com/data-iitd/Grafenne. |
| Open Datasets | Yes | We evaluate GRAFENNE on the real-world graphs listed in Table 1. Among these, Actor is a heterophilic graph, whereas the rest are homophilic. Further details on the semantics of the datasets are provided in App. H. Table 1 datasets: Cora (Sen et al., 2008), CiteSeer (Yang et al., 2016), Physics (Shchur et al., 2018), Actor (Pei et al., 2020). |
| Dataset Splits | Yes | We perform a 60%/20%/20% train-test-validation data split. A hedged split sketch is given after the table. |
| Hardware Specification | Yes | All experiments are performed on an Intel Xeon Gold 6248 processor with 80 cores, 1 Tesla V100 GPU with 32 GB GPU memory, and 377 GB RAM, running Ubuntu 18.04. |
| Software Dependencies | No | The paper specifies the operating system (Ubuntu 18.04) but does not list specific versions for other key software components, libraries, or frameworks used (e.g., Python, PyTorch, TensorFlow). |
| Experiment Setup | Yes | We used 2 layers of message-passing and trained GRAFENNE using the Adam optimizer with a learning rate of 0.0001, choosing the model based on the best validation loss. A hedged training-loop sketch is given after the table. |
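
The paper reports a 60%/20%/20% train-test-validation split but does not say how the masks are constructed. The sketch below is an assumption, not the authors' code: it builds random boolean node masks in the style of a PyTorch Geometric `Data` object, with `make_split_masks` and its `seed` parameter being hypothetical names introduced here for illustration.

```python
# Hedged sketch of a 60/20/20 node split; the mask construction and seeding
# are assumptions, not taken from the GRAFENNE codebase.
import torch

def make_split_masks(num_nodes: int, seed: int = 0):
    """Return boolean train/val/test masks in a 60/20/20 node split."""
    g = torch.Generator().manual_seed(seed)
    perm = torch.randperm(num_nodes, generator=g)
    n_train = int(0.6 * num_nodes)
    n_val = int(0.2 * num_nodes)

    train_mask = torch.zeros(num_nodes, dtype=torch.bool)
    val_mask = torch.zeros(num_nodes, dtype=torch.bool)
    test_mask = torch.zeros(num_nodes, dtype=torch.bool)

    train_mask[perm[:n_train]] = True
    val_mask[perm[n_train:n_train + n_val]] = True
    test_mask[perm[n_train + n_val:]] = True
    return train_mask, val_mask, test_mask
```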
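For the experiment setup, the paper states only: 2 message-passing layers, the Adam optimizer with learning rate 0.0001, and model selection by best validation loss. The following is a minimal sketch of that recipe, assuming a node-classification objective; a plain 2-layer GCN stands in for GRAFENNE's actual message-passing layers, and the hidden width and epoch count are illustrative assumptions.

```python
# Hedged sketch of the stated setup: Adam, lr = 1e-4, checkpoint chosen by
# best validation loss. GCNConv is a placeholder for GRAFENNE's own layers.
import copy
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class TwoLayerGNN(torch.nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)

def train(model, data, train_mask, val_mask, epochs=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)  # lr from the paper
    best_val, best_state = float("inf"), None
    for _ in range(epochs):
        model.train()
        opt.zero_grad()
        out = model(data.x, data.edge_index)
        loss = F.cross_entropy(out[train_mask], data.y[train_mask])
        loss.backward()
        opt.step()

        model.eval()
        with torch.no_grad():
            val_loss = F.cross_entropy(
                model(data.x, data.edge_index)[val_mask], data.y[val_mask]
            ).item()
        if val_loss < best_val:  # keep the checkpoint with best validation loss
            best_val, best_state = val_loss, copy.deepcopy(model.state_dict())
    model.load_state_dict(best_state)
    return model
```

The best-validation-loss selection mirrors the paper's stated model-choice criterion; everything else about the loop (loss function, epoch budget) is a guess for the sake of a runnable example.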