DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks
Authors: Pál András Papp, Karolis Martinkus, Lukas Faber, Roger Wattenhofer
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We experimentally validate our theoretical findings on expressiveness. Furthermore, we show that DropGNNs perform competitively on established GNN benchmarks. |
| Researcher Affiliation | Academia | Pál András Papp, ETH Zurich, apapp@ethz.ch; Karolis Martinkus, ETH Zurich, martinkus@ethz.ch; Lukas Faber, ETH Zurich, lfaber@ethz.ch; Roger Wattenhofer, ETH Zurich, wattenhofer@ethz.ch |
| Pseudocode | No | The paper does not include pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is publicly available. |
| Open Datasets | Yes | We use the datasets from Sato et al. [29] that are based on 3-regular graphs. Nodes have to predict whether they are part of a triangle (TRIANGLES) or have to predict their local clustering coefficient (LCC). We test on the two counterexamples LIMITS 1 (Figure 2a) and LIMITS 2 from Garg et al. [11]. |
| Dataset Splits | Yes | Following previous work [22; 21] the data is split into 80% training, 10% validation, and 10% test sets. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running experiments. |
| Software Dependencies | No | The paper mentions software but does not specify version numbers for key components (e.g., Python, PyTorch) that would be needed for replication. |
| Experiment Setup | Yes | Unless stated otherwise, we set the number of runs to m and choose the dropout probability to be p = 1/m, where m is the mean number of nodes in the graphs in the dataset. (See the sketch after the table.) |
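
The run-count and dropout-probability rule quoted in the Experiment Setup row is simple to illustrate. Below is a minimal sketch, not the authors' released code: it assumes a generic PyTorch-style `gnn` callable, a node-feature tensor `x`, and an `edge_index`; the helper names `dropout_hyperparameters` and `dropgnn_forward` are hypothetical, and masking node features is only an approximation of the paper's removal of dropped nodes from message passing.

```python
# Sketch of the DropGNN-style experiment setup described above (illustrative only).
import torch

def dropout_hyperparameters(nodes_per_graph):
    """Return (p, num_runs) from the mean number of nodes m in the dataset."""
    m = sum(nodes_per_graph) / len(nodes_per_graph)
    p = 1.0 / m              # in expectation, roughly one node is dropped per run
    num_runs = round(m)      # number of independent dropout runs, set to m
    return p, num_runs

def dropgnn_forward(gnn, x, edge_index, p, num_runs):
    """Average node embeddings over several runs with independent node dropouts."""
    outputs = []
    for _ in range(num_runs):
        keep = (torch.rand(x.size(0)) > p).float()   # drop each node with probability p
        x_run = x * keep.unsqueeze(-1)               # feature masking stands in for node removal
        outputs.append(gnn(x_run, edge_index))
    return torch.stack(outputs).mean(dim=0)          # aggregate the runs
```

With graphs of roughly m nodes and p = 1/m, each run drops about one node in expectation, and running on the order of m times gives every node a reasonable chance of being the dropped one in some run, which is the intuition behind the quoted setting.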