Reliable Graph Neural Networks via Robust Aggregation
Authors: Simon Geisler, Daniel Zügner, Stephan Günnemann
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirically, our method improves the robustness of its base architecture w.r.t. structural perturbations by up to 550% (relative), and outperforms previous state-of-the-art defenses. |
| Researcher Affiliation | Academia | Simon Geisler Daniel Zügner Stephan Günnemann Department of Informatics Technical University of Munich {geisler, zuegnerd, guennemann}@in.tum.de |
| Pseudocode | No | The paper describes algorithms and methods verbally and through equations but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The source code is available at https://www.daml.in.tum.de/reliable_gnn_via_robust_aggregation. |
| Open Datasets | Yes | We evaluate these models on Cora ML [47], Citeseer [41], and PubMed [47] for semi-supervised node classification. |
| Dataset Splits | Yes | For each approach and dataset, we rerun the experiment with three different seeds, use 20 labels per class each for training and validation, and report the one-sigma error of the mean. |
| Hardware Specification | Yes | We used one 2.20 GHz core and one GeForce GTX 1080 Ti (11 GB). |
| Software Dependencies | No | The paper mentions general tools or frameworks (e.g., GNNs, DeepRobust's implementation) but does not provide specific version numbers for software libraries or dependencies. |
| Experiment Setup | Yes | We set the number of hidden units for all architectures to 64, the learning rate to 0.01, weight decay to 5e-4, and train for 3000 epochs with a patience of 300. For the architectures incorporating our Soft Medoid, we perform a grid search over different temperatures T... In the experiments on Cora ML and Citeseer we use T = 0.15 as well as k = 64. We use T = 0.15 as well as k = 32 in the PubMed experiments. |
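Since the paper provides no pseudocode, a minimal sketch of the core idea may help: the Soft Medoid replaces a hard medoid with a softmax over (negative, temperature-scaled) total pairwise distances, so that points far from the rest, such as adversarially injected neighbors, receive small weight. This is an illustrative reimplementation from the paper's description, not the authors' reference code; the function name and NumPy formulation are assumptions, and the authors' weighted variant with edge weights is omitted.

```python
import numpy as np

def soft_medoid(X, T=1.0):
    """Sketch of an (unweighted) Soft Medoid aggregation.

    X: (n, d) array of neighbor embeddings.
    T: temperature; T -> 0 approaches the hard medoid,
       T -> inf approaches the mean. The paper reports T = 0.15
       for Cora ML / Citeseer / PubMed.
    """
    # Pairwise Euclidean distances between the n rows of X.
    diffs = X[:, None, :] - X[None, :, :]   # (n, n, d)
    dists = np.linalg.norm(diffs, axis=-1)  # (n, n)
    # Total distance of each point to all others.
    c = dists.sum(axis=1)                   # (n,)
    # Softmax over negative scaled distances (stabilized).
    z = -c / T
    z -= z.max()
    s = np.exp(z) / np.exp(z).sum()
    # Convex combination of the inputs: outliers get tiny weight.
    return s @ X
```

With a small temperature, a single far-away outlier barely moves the aggregate, whereas the plain mean is dragged toward it; this is the robustness property the paper exploits inside message passing.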