Diffusion Improves Graph Learning
Authors: Johannes Gasteiger, Stefan Weißenberger, Stephan Günnemann
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate that replacing message passing with graph diffusion convolution consistently leads to significant performance improvements across a wide range of models on both supervised and unsupervised tasks and a variety of datasets. Furthermore, GDC is not limited to GNNs but can trivially be combined with any graph-based model or algorithm (e.g. spectral clustering) without requiring any changes to the latter or affecting its computational complexity. Our implementation is available online. |
| Researcher Affiliation | Academia | Johannes Gasteiger, Stefan Weißenberger, Stephan Günnemann Technical University of Munich {j.gasteiger,stefan.weissenberger,guennemann}@in.tum.de |
| Pseudocode | No | The paper does not contain any explicit pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our implementation is available online: https://www.daml.in.tum.de/gdc |
| Open Datasets | Yes | We evaluate GDC on six datasets: The citation graphs CITESEER [66], CORA [48], and PUBMED [53], the co-author graph COAUTHOR CS [67], and the co-purchase graphs AMAZON COMPUTERS and AMAZON PHOTO [47, 67]. |
| Dataset Splits | Yes | We optimize the hyperparameters of all models on all datasets with both the unmodified graph and all GDC variants separately using a combination of grid and random search on the validation set. Each result is averaged across 100 data splits and random initializations for supervised tasks and 20 random initializations for unsupervised tasks, as suggested by Gasteiger et al. [23] and Shchur et al. [67]. |
| Hardware Specification | No | The paper does not provide specific details on the hardware used (e.g., GPU/CPU models, memory specifications) for running the experiments. |
| Software Dependencies | No | The paper mentions software like 'PyTorch Geometric' [21], 'PyTorch' [58], 'SciPy' [30], 'scikit-learn' [59], 'NumPy' [72], and 'graph-tool' [60], but does not provide specific version numbers for any of these dependencies. |
| Experiment Setup | Yes | We optimize the hyperparameters of all models on all datasets with both the unmodified graph and all GDC variants separately using a combination of grid and random search on the validation set. ... Dataset statistics and hyperparameters are reported in App. B. ... Fig. 8 shows that their optimal hyperparameters typically fall within a narrow range of α ∈ [0.05, 0.2] and t ∈ [1, 10]. |
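To make the summarized method concrete: graph diffusion convolution (GDC) replaces the one-hop adjacency matrix of a message-passing model with a sparsified diffusion matrix, e.g. personalized PageRank (PPR) with teleport probability α in the range quoted above. The sketch below is a minimal dense-matrix illustration with NumPy, not the authors' implementation; the function name `gdc_ppr` and the simple threshold-ε sparsification with column renormalization are assumptions for illustration.

```python
import numpy as np

def gdc_ppr(A, alpha=0.15, eps=0.01):
    """Illustrative GDC preprocessing: PPR diffusion of adjacency A,
    thresholded at eps and column-renormalized (a sketch, not the paper's code)."""
    N = A.shape[0]
    # Add self-loops and build the symmetric transition matrix
    # T = D^{-1/2} (A + I) D^{-1/2}.
    A_tilde = A + np.eye(N)
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    T = D_inv_sqrt @ A_tilde @ D_inv_sqrt
    # Closed-form PPR diffusion: S = alpha * (I - (1 - alpha) T)^{-1}.
    S = alpha * np.linalg.inv(np.eye(N) - (1 - alpha) * T)
    # Sparsify by zeroing small entries, then renormalize columns so the
    # result can stand in for a transition matrix in any downstream model.
    S[S < eps] = 0.0
    col_sums = S.sum(axis=0, keepdims=True)
    col_sums[col_sums == 0] = 1.0
    return S / col_sums
```

The resulting matrix S simply replaces the adjacency matrix fed to a GNN (or to spectral clustering), which is why the paper can apply GDC to existing models without modifying them; dense inversion is O(N³), so real implementations use sparse or approximate PPR instead.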