Local Augmentation for Graph Neural Networks

Authors: Songtao Liu, Rex Ying, Hanze Dong, Lanqing Li, Tingyang Xu, Yu Rong, Peilin Zhao, Junzhou Huang, Dinghao Wu

ICML 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments and analyses show that local augmentation consistently yields performance improvement when applied to various GNN architectures across a diverse set of benchmarks.
Researcher Affiliation | Collaboration | Songtao Liu (1), Rex Ying (2), Hanze Dong (3), Lanqing Li (4), Tingyang Xu (4), Yu Rong (4), Peilin Zhao (4), Junzhou Huang (4), Dinghao Wu (1); (1) The Pennsylvania State University, (2) Stanford University, (3) Hong Kong University of Science and Technology, (4) Tencent AI Lab.
Pseudocode | Yes | Algorithm 1: Local Augmentation for Graph Neural Networks
Open Source Code | Yes | Code is available at https://github.com/SongtaoLiu0823/LAGNN.
Open Datasets | Yes | We utilize three public citation network datasets, Cora, Citeseer, and Pubmed (Sen et al., 2008), for semi-supervised node classification. All the dataset statistics can be found in Appendix D.
Dataset Splits | Yes | We apply the standard fixed splits (Yang et al., 2016) on Cora, Citeseer, and Pubmed, with 20 nodes per class for training, 500 nodes for validation, and 1,000 nodes for testing.
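To make the split sizes above concrete, here is a minimal sketch of a Planetoid-style split builder. This is not the authors' code: the actual benchmark (Yang et al., 2016) ships one fixed, published split, whereas the random per-class sampling here only illustrates the protocol of 20 labeled nodes per class for training, 500 for validation, and 1,000 for testing. The function name and seed are illustrative assumptions.

```python
import numpy as np

def planetoid_style_split(labels, num_train_per_class=20,
                          num_val=500, num_test=1000, seed=0):
    """Build boolean train/val/test masks following the Planetoid
    protocol sizes: 20 nodes per class for training, then 500 for
    validation and 1,000 for testing from the remaining nodes.
    NOTE: illustrative random sampling, not the fixed benchmark split."""
    rng = np.random.default_rng(seed)
    n = labels.shape[0]
    train_mask = np.zeros(n, dtype=bool)
    # Pick 20 labeled nodes per class for training.
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        train_mask[rng.choice(idx, size=num_train_per_class,
                              replace=False)] = True
    # Draw validation and test nodes from the non-training remainder.
    remaining = rng.permutation(np.flatnonzero(~train_mask))
    val_mask = np.zeros(n, dtype=bool)
    test_mask = np.zeros(n, dtype=bool)
    val_mask[remaining[:num_val]] = True
    test_mask[remaining[num_val:num_val + num_test]] = True
    return train_mask, val_mask, test_mask
```

For a 7-class dataset such as Cora this yields 7 × 20 = 140 training nodes, matching the split described above. In practice, the fixed split is obtained directly, e.g. via `torch_geometric.datasets.Planetoid(..., split="public")`.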
Hardware Specification | Yes | All the experiments in this work are conducted on a single NVIDIA Tesla V100 with 32GB memory size.
Software Dependencies | Yes | The software used for the experiments is Python 3.6.8, pytorch 1.9.0, pytorch-cluster 1.5.9, pytorch-scatter 2.0.9, pytorch-sparse 0.6.12, pyg 2.0.3, ogb 1.3.2, dgl 0.7.2, numpy 1.19.2, torchvision 0.10.0, CUDA 10.2.89, and CUDNN 7.6.5.
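The dependency list above can be pinned for reproduction; a sketch of the corresponding pip commands, assuming the usual PyPI package names for the listed versions (the PyG extension wheels must match the torch 1.9.0 / CUDA 10.2 build, typically via the data.pyg.org wheel index):

```shell
# Illustrative pins matching the reported versions; not the authors' script.
pip install numpy==1.19.2 torch==1.9.0 torchvision==0.10.0
pip install torch-scatter==2.0.9 torch-sparse==0.6.12 torch-cluster==1.5.9 \
    -f https://data.pyg.org/whl/torch-1.9.0+cu102.html
pip install torch-geometric==2.0.3 ogb==1.3.2 dgl-cu102==0.7.2
```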
Experiment Setup | Yes | More details about hyperparameters can be found in Tables 10 and 11.