Graph Posterior Network: Bayesian Predictive Uncertainty for Node Classification

Authors: Maximilian Stadler, Bertrand Charpentier, Simon Geisler, Daniel Zügner, Stephan Günnemann

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We extensively evaluate GPN and a strong set of baselines on semi-supervised node classification, including detection of anomalous features and detection of left-out classes.
Researcher Affiliation | Academia | Maximilian Stadler, Bertrand Charpentier, Simon Geisler, Daniel Zügner, Stephan Günnemann; Department of Informatics, Technical University of Munich, Germany; {stadlmax, charpent, geisler, zuegnerd, guennemann}@in.tum.de
Pseudocode | No | The paper describes the architecture of GPN and its steps in text and diagrams (Fig. 2), but does not contain structured pseudocode or algorithm blocks (a hedged sketch of the described pipeline follows this table).
Open Source Code | Yes | Project page including code at https://www.daml.in.tum.de/graph-postnet
Open Datasets | Yes | It contains common citation network datasets (i.e. CoraML [65, 32, 31, 85], CiteSeer [32, 31, 85], PubMed [73], Coauthor Physics [87], Coauthor CS [87]) and co-purchase datasets (i.e. Amazon Photos [64, 87], Amazon Computers [64, 87]).
Dataset Splits | Yes | The results are averaged over 10 initialization splits with a train/val/test split of 5%/15%/80% using stratified sampling (a split-generation sketch follows this table).
Hardware Specification | No | The paper states resource constraints ('We set a threshold of 64 GiB and 12 hours per training run') but does not provide specific details about the hardware used, such as CPU or GPU models.
Software Dependencies | No | The paper mentions software components like 'MLP', 'radial normalizing flows', 'PPR', and cites 'TensorFlow' and 'PyTorch Geometric', but it does not specify version numbers for these or any other ancillary software dependencies used in the experiments.
Experiment Setup | Yes | We used early stopping and report the used hyperparameters in the appendix. Further model details are given in the appendix.
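Since the paper itself provides no pseudocode, the following is a minimal, hedged sketch of the prediction pipeline it describes in text and Fig. 2: an MLP feature encoder, per-class density estimates that yield pseudo-counts, PPR-style diffusion of those counts over the graph, and a Dirichlet over class probabilities. The class names, dimensions, and the diagonal-Gaussian stand-in for the paper's radial normalizing flows are illustrative assumptions, not the authors' implementation; the released code at the project page is the authoritative reference.

```python
# Hedged sketch of the GPN prediction pipeline (encoder -> per-class density ->
# pseudo-counts -> PPR diffusion -> Dirichlet). Illustrative only; the Gaussian
# density below stands in for the paper's radial normalizing flows.
import math
import torch
import torch.nn as nn


class GPNSketch(nn.Module):
    def __init__(self, num_features, hidden_dim, latent_dim, num_classes,
                 ppr_teleport=0.1, ppr_iterations=10):
        super().__init__()
        # Node-wise MLP encoder: maps raw features to a low-dimensional latent space.
        self.encoder = nn.Sequential(
            nn.Linear(num_features, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, latent_dim),
        )
        # Stand-in for the per-class radial normalizing flows: one diagonal
        # Gaussian per class with learnable mean and log-variance (assumption).
        self.class_means = nn.Parameter(torch.randn(num_classes, latent_dim))
        self.class_logvars = nn.Parameter(torch.zeros(num_classes, latent_dim))
        self.ppr_teleport = ppr_teleport
        self.ppr_iterations = ppr_iterations

    def class_log_density(self, z):
        # log p(z | c) for every node and class, shape [num_nodes, num_classes].
        diff = z.unsqueeze(1) - self.class_means.unsqueeze(0)   # [N, C, D]
        var = self.class_logvars.exp().unsqueeze(0)             # [1, C, D]
        return -0.5 * ((diff ** 2) / var + self.class_logvars
                       + math.log(2 * math.pi)).sum(-1)

    def forward(self, x, adj_norm, class_counts):
        # 1) Encode node features independently of the graph structure.
        z = self.encoder(x)
        # 2) Feature-level pseudo-counts: beta_c(v) proportional to N_c * p(z_v | c).
        beta_ft = class_counts.unsqueeze(0) * self.class_log_density(z).exp()
        # 3) Personalized-PageRank-style diffusion of pseudo-counts (power iteration).
        beta = beta_ft
        for _ in range(self.ppr_iterations):
            beta = (1 - self.ppr_teleport) * (adj_norm @ beta) + self.ppr_teleport * beta_ft
        # 4) Dirichlet parameters with a flat prior of 1 per class; the predictive
        #    class probabilities are the Dirichlet mean.
        alpha = 1.0 + beta
        probs = alpha / alpha.sum(dim=-1, keepdim=True)
        return alpha, probs
```

Here `adj_norm` is assumed to be a normalized adjacency matrix and `class_counts` the number of labeled training nodes per class; both names are placeholders for illustration.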
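The evaluation protocol quoted in the Dataset Splits row (10 splits, stratified 5%/15%/80% train/val/test over node labels) could be reproduced along the following lines. The use of scikit-learn's train_test_split and the seeding scheme are assumptions for illustration, not the authors' splitting code.

```python
# Hedged sketch of the stratified 5%/15%/80% node splits repeated for 10 seeds.
import numpy as np
from sklearn.model_selection import train_test_split


def stratified_node_splits(labels, num_splits=10, train_frac=0.05,
                           val_frac=0.15, seed=42):
    """Yield (train_idx, val_idx, test_idx) arrays for each of `num_splits` seeds."""
    labels = np.asarray(labels)
    indices = np.arange(len(labels))
    for split in range(num_splits):
        # Carve out the 5% training nodes, stratified by class label.
        train_idx, rest_idx = train_test_split(
            indices, train_size=train_frac, stratify=labels,
            random_state=seed + split)
        # From the remainder, take enough nodes so validation is 15% of all nodes;
        # the rest (80% of all nodes) becomes the test set.
        val_size = val_frac / (1.0 - train_frac)
        val_idx, test_idx = train_test_split(
            rest_idx, train_size=val_size, stratify=labels[rest_idx],
            random_state=seed + split)
        yield train_idx, val_idx, test_idx
```

Results would then be averaged over the 10 generated splits, matching the protocol quoted above.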