Deep Graph Neural Networks via Posteriori-Sampling-based Node-Adaptive Residual Module

Authors: Jingbo Zhou, Yixuan Du, Ruqiong Zhang, Jun Xia, Zhizhi Yu, Zelin Zang, Di Jin, Carl Yang, Rui Zhang, Stan Z. Li

NeurIPS 2024

Reproducibility assessment (each entry gives the variable, the assessed result, and the supporting LLM response):
Research Type: Experimental
"We theoretically demonstrate that PSNR can alleviate the drawbacks of previous residual methods. Furthermore, extensive experiments verify the superiority of the PSNR module in fully observed node classification and missing feature scenarios."

Researcher Affiliation: Academia
"Westlake University, Jilin University, Tianjin University, Emory University"

Pseudocode: No
The paper does not contain structured pseudocode or algorithm blocks; Figure 2 is a module framework diagram, not pseudocode.

Open Source Code: Yes
"Our code is available at https://github.com/jingbo02/PSNR-GNN."

Open Datasets: Yes
"We conducted experiments on ten real-world datasets, including three citation network datasets, i.e., Cora, Citeseer, Pubmed [27], two web network datasets, i.e., Chameleon and Squirrel [25], co-author/co-purchase network datasets, i.e., Coauthor-CS [28], Amazon-Photo [28], and three larger datasets, i.e., Flickr [20], Coauthor-Physics [28] and Ogbn-arxiv [10]." (A dataset-loading sketch follows this list.)

Dataset Splits: Yes
"For Cora, Citeseer, Coauthor-CS, Amazon-Photo, we randomly select 20 nodes per class for the training set, 500 nodes for validation and 1000 nodes for testing. For Chameleon and Squirrel, we randomly divide each class's nodes into 60%, 20%, and 20% as the train, validation, and test sets, respectively. ... (Experiment 5.6) For larger datasets, we randomly divide each class's nodes into 20%, 20%, and 60% as the train, validation, and test sets, respectively." (A split-construction sketch follows this list.)

Hardware Specification: Yes
"Experimental results are obtained from the server with four core Intel(R) Xeon(R) Platinum 8358 CPUs @ 2.60GHz, one NVIDIA A100 GPU (80G), and models and datasets used in this paper are implemented using the Deep Graph Library (DGL) and PyTorch Geometric (PyG)."

Software Dependencies: No
The paper states that models and datasets are implemented using the "Deep Graph Library (DGL) and PyTorch Geometric (PyG)" but does not specify version numbers for these software dependencies, which is required for reproducibility. (A version-recording snippet follows this list.)

Experiment Setup: Yes
"Further details on the specific parameter settings can be found in Appendix G. ... We summarized the hyperparameters used in different experiments in Table 9." (Table 9 lists the Learning Rate, Dropout, Weight Decay, Hidden State, Attention Head, Max Epoch, and Early Stopping Epoch for the various experiments; a configuration sketch follows this list.)

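The datasets named under Open Datasets are all fetchable with off-the-shelf loaders. Below is a minimal sketch, assuming a recent DGL release and the `ogb` package; it is not taken from the authors' repository.

```python
# Minimal sketch: fetch the benchmark graphs named in the paper.
# Assumes a recent DGL and the `ogb` package; not the authors' own loader code.
import dgl.data as D
from ogb.nodeproppred import DglNodePropPredDataset

# Citation network (Cora shown). CiteseerGraphDataset, PubmedGraphDataset,
# ChameleonDataset, SquirrelDataset, CoauthorCSDataset, CoauthorPhysicsDataset,
# AmazonCoBuyPhotoDataset, and FlickrDataset are loaded the same way.
g = D.CoraGraphDataset()[0]
feats, labels = g.ndata["feat"], g.ndata["label"]
print(g.num_nodes(), tuple(feats.shape), int(labels.max()) + 1)

# Ogbn-arxiv ships with its own standard split.
dataset = DglNodePropPredDataset(name="ogbn-arxiv")
graph, arxiv_labels = dataset[0]
split_idx = dataset.get_idx_split()  # dict with "train" / "valid" / "test" indices
```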
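The quoted split protocols are simple enough to reconstruct in a few lines of PyTorch. The sketch below follows the quoted counts and percentages; the seed and shuffling details are assumptions, not the authors' exact script.

```python
# Sketch of the two quoted split protocols; not the authors' exact script.
import torch

def per_class_split(labels, train_per_class=20, num_val=500, num_test=1000, seed=0):
    """Cora/Citeseer/Coauthor-CS/Amazon-Photo protocol: 20 training nodes per
    class, then 500 validation and 1000 test nodes from the remainder."""
    gen = torch.Generator().manual_seed(seed)
    train = []
    for c in labels.unique():
        idx = (labels == c).nonzero(as_tuple=True)[0]
        idx = idx[torch.randperm(idx.numel(), generator=gen)]
        train.append(idx[:train_per_class])
    train = torch.cat(train)
    keep = torch.ones(labels.numel(), dtype=torch.bool)
    keep[train] = False
    rest = keep.nonzero(as_tuple=True)[0]
    rest = rest[torch.randperm(rest.numel(), generator=gen)]
    return train, rest[:num_val], rest[num_val:num_val + num_test]

def proportional_split(labels, fracs=(0.6, 0.2, 0.2), seed=0):
    """Chameleon/Squirrel protocol: split each class's nodes 60/20/20.
    The larger datasets use fracs=(0.2, 0.2, 0.6) per the quoted text."""
    gen = torch.Generator().manual_seed(seed)
    tr, va, te = [], [], []
    for c in labels.unique():
        idx = (labels == c).nonzero(as_tuple=True)[0]
        idx = idx[torch.randperm(idx.numel(), generator=gen)]
        n_tr, n_va = int(fracs[0] * idx.numel()), int(fracs[1] * idx.numel())
        tr.append(idx[:n_tr])
        va.append(idx[n_tr:n_tr + n_va])
        te.append(idx[n_tr + n_va:])
    return torch.cat(tr), torch.cat(va), torch.cat(te)
```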
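Since the report flags unpinned DGL/PyG versions, anyone re-running the code should record their own environment; the authors' exact versions are unknown and cannot be filled in here. A small snippet for logging them:

```python
# Log the library versions actually used, since the paper does not pin them.
import torch
import dgl
import torch_geometric

print("torch:", torch.__version__, "| CUDA:", torch.version.cuda)
print("dgl:", dgl.__version__)
print("torch_geometric:", torch_geometric.__version__)
```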
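Finally, the Experiment Setup entry points to Table 9 for per-experiment hyperparameters. The sketch below mirrors those fields and wires them into an early-stopping loop; every numeric value is an illustrative placeholder, not a Table 9 entry.

```python
# Placeholder configuration mirroring the fields of Table 9.
# The numbers are illustrative defaults, NOT the values reported in the paper.
config = {
    "lr": 1e-2,            # Learning Rate (placeholder)
    "dropout": 0.5,        # Dropout (placeholder)
    "weight_decay": 5e-4,  # Weight Decay (placeholder)
    "hidden": 64,          # Hidden State size (placeholder)
    "heads": 1,            # Attention Heads (placeholder)
    "max_epoch": 1000,     # Max Epoch (placeholder)
    "patience": 100,       # Early Stopping Epoch (placeholder)
}

class EarlyStopping:
    """Stop once the validation metric fails to improve for `patience` epochs."""
    def __init__(self, patience):
        self.patience, self.best, self.wait = patience, float("-inf"), 0

    def step(self, val_metric):
        if val_metric > self.best:
            self.best, self.wait = val_metric, 0
        else:
            self.wait += 1
        return self.wait >= self.patience  # True means: stop training

stopper = EarlyStopping(config["patience"])
# In a training loop: `if stopper.step(val_acc): break`
```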