Subgraph Neural Networks
Authors: Emily Alsentzer, Samuel Finlayson, Michelle Li, Marinka Zitnik
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirical results for subgraph classification on eight datasets show that SUBGNN achieves considerable performance gains, outperforming strong node-level and graph-level baseline methods with a 19.8% improvement over the strongest baseline. |
| Researcher Affiliation | Academia | Emily Alsentzer Harvard University, MIT emilya@mit.edu Samuel G. Finlayson Harvard University, MIT sgfin@mit.edu Michelle M. Li Harvard University michelleli@g.harvard.edu Marinka Zitnik Harvard University marinka@hms.harvard.edu |
| Pseudocode | Yes | A full description of SUBGNN is detailed in Appendix A. (referring to Algorithm 1 in Appendix A.1) |
| Open Source Code | Yes | Code and datasets are available at https://github.com/mims-harvard/SubGNN. |
| Open Datasets | Yes | Code and datasets are available at https://github.com/mims-harvard/SubGNN. |
| Dataset Splits | No | The paper mentions using 'validation loss' for early stopping in Appendix D, implying the existence of a validation set, but it does not specify the exact percentages or counts for train/validation/test splits. |
| Hardware Specification | Yes | With pre-computation, training SUBGNN on the real-world dataset PPI-BP takes 30s per epoch on a single GTX 1080 GPU. |
| Software Dependencies | No | We use PyTorch Lightning [17] for modularity and reproducibility. All experiments were implemented in Python 3.7.4. (Missing versions for key libraries like PyTorch Lightning or PyTorch itself.) |
| Experiment Setup | Yes | We train all models for 100 epochs, with early stopping enabled based on validation loss with a patience of 10. We use the Adam optimizer with default parameters (β1 = 0.9, β2 = 0.999, ε = 1e−8). The learning rate is initialized to 1e−3 and reduced by a factor of 0.5 every 25 epochs. We perform a random search for batch sizes from {1, 2, 4, 8, 16, 32, 64, 128} and report the best results. |
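The experiment setup row above fully specifies the learning-rate schedule and early-stopping rule, so it can be sketched directly. The following is a minimal framework-agnostic sketch of those two pieces; the function and class names (`lr_at_epoch`, `EarlyStopper`) are illustrative and not taken from the SubGNN codebase, which uses PyTorch Lightning's built-in callbacks.

```python
# Sketch of the reported training setup: lr starts at 1e-3 and is halved
# every 25 epochs; training stops early after 10 epochs without a new
# best validation loss. Names here are illustrative, not from SubGNN.

def lr_at_epoch(epoch, base_lr=1e-3, gamma=0.5, step=25):
    """Step-decay schedule: multiply lr by `gamma` every `step` epochs."""
    return base_lr * gamma ** (epoch // step)

class EarlyStopper:
    """Stop when validation loss fails to improve for `patience` epochs."""
    def __init__(self, patience=10):
        self.patience = patience
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        if val_loss < self.best:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience  # True => stop training
```

In PyTorch this corresponds to `torch.optim.Adam` with its default betas/epsilon, a `StepLR(step_size=25, gamma=0.5)` scheduler, and an early-stopping callback monitoring validation loss with `patience=10`.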