Defending Graph Convolutional Networks against Dynamic Graph Perturbations via Bayesian Self-Supervision

Authors: Jun Zhuang, Mohammad Al Hasan | pp. 4405-4413

AAAI 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments demonstrate that GraphSS can not only affirmatively alert the perturbations on dynamic graphs but also effectively recover the prediction of a node classifier when the graph is under such perturbations. These two advantages prove to be generalized over three classic GCNs across five public graph datasets.
Researcher Affiliation | Academia | Jun Zhuang, Mohammad Al Hasan. Indiana University-Purdue University Indianapolis, IN, USA. junz@iu.edu, alhasan@iupui.edu
Pseudocode | Yes | The pseudo-code of GraphSS is shown in Algorithm (1).
Open Source Code | Yes | Source code: https://github.com/junzhuang-code/GraphSS
Open Datasets | Yes | Our proposed model is evaluated on five public datasets in Table 1. Cora, Citeseer, and Pubmed are well-known citation graphs (Sen et al. 2008). AMZcobuy comes from the photo segment of the Amazon co-purchase graph (Shchur et al. 2018). Coauthor is a co-authorship graph of computer science based on the Microsoft Academic Graph from the KDD Cup 2016 challenge.
Dataset Splits | Yes | For all five datasets, the train, validation, and test partitions comprise 40%, 20%, and 40% of the nodes, respectively.
Hardware Specification | No | The paper does not provide specific hardware details such as CPU/GPU models, memory, or cloud instance types used for experiments.
Software Dependencies | No | The paper mentions models like GCN, SGC, and GraphSAGE, but does not provide specific software dependencies with version numbers (e.g., Python 3.x, PyTorch x.x).
Experiment Setup | Yes | We set the number of training epochs as 200 because all node classifiers converge within 200 epochs in the training phase. We fix the number of inference epochs as 100 and then apply grid search to look for the optimal parameters, where WS ∈ [5, 80] and Retrain ∈ [20, 100].
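The reported setup (a 40%/20%/40% node split and a grid search over WS ∈ [5, 80] and Retrain ∈ [20, 100]) can be sketched as below. This is a minimal illustration, not the authors' code: the helper name `split_nodes`, the random seed, and the specific grid step values are assumptions, since the paper only states the search ranges.

```python
import itertools
import random


def split_nodes(num_nodes, seed=0):
    """Shuffle node indices and split them 40% train / 20% val / 40% test,
    matching the partition percentages reported for all five datasets."""
    idx = list(range(num_nodes))
    random.Random(seed).shuffle(idx)
    n_train = int(0.4 * num_nodes)
    n_val = int(0.2 * num_nodes)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]


# Hypothetical grid over the two searched hyperparameters; the paper gives
# only the ranges WS in [5, 80] and Retrain in [20, 100], so these step
# values are illustrative assumptions.
ws_grid = [5, 20, 40, 80]
retrain_grid = [20, 50, 100]
candidates = list(itertools.product(ws_grid, retrain_grid))
```

Each `(WS, Retrain)` pair in `candidates` would then be evaluated on the validation split to pick the optimal setting.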