Improving Distinguishability of Class for Graph Neural Networks

Authors: Dongxiao He, Shuwei Liu, Meng Ge, Zhizhi Yu, Guangquan Xu, Zhiyong Feng

AAAI 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments on six real-world datasets have shown the competitive performance of Disc-GNN relative to state-of-the-art methods on node classification and node clustering tasks.
Researcher Affiliation | Academia | (1) College of Intelligence and Computing, Tianjin University, Tianjin, China; (2) Saw Swee Hock School of Public Health, National University of Singapore, Singapore. {hedongxiao, liusw, yuzhizhi, losin, zyfeng}@tju.edu.cn, gemeng@nus.edu.sg
Pseudocode | No | No structured pseudocode or algorithm blocks were found in the paper.
Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository for the described methodology.
Open Datasets | Yes | Six real-world datasets with varying sizes and features are used to comprehensively evaluate the performance of our proposed Disc-GNN. The statistical information of datasets is summarized in Table 1. Specifically, the datasets can be divided into two categories: Homophilic datasets: We choose three common citation graphs, i.e., Cora, Citeseer, and Pubmed (McCallum et al. 2000; Sen et al. 2008), as homophilic datasets. ... Heterophilic datasets: We also select three web graphs, i.e., Wisconsin, Texas, and Cornell (Pei et al. 2019), as heterophilic graphs. (See the dataset-loading sketch after this table.)
Dataset Splits | No | The paper mentions 'a portion of node labels are used for training, while the remaining node labels are masked for testing' but does not specify validation split percentages or absolute counts for the training, validation, and test sets.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, processor types, or memory amounts) used for running its experiments.
Software Dependencies | No | The paper mentions 'Pytorch' and 'Adam optimizer' but does not provide specific version numbers for these or any other software dependencies.
Experiment Setup | Yes | For Disc-GNN, the hyper-parameter settings are as follows: learning rate is 0.01, dropout in [0.4, 0.6], weight decay in [5e-4, 1e-2], regularization coefficient η in [0, 0.5], relaxation factor ϵ in [-0.5, 0.5], and µ is 1e-4. (A hedged training-setup sketch follows the table.)
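
The paper does not state how the six benchmarks were obtained. As one plausible route, the sketch below loads them through PyTorch Geometric's Planetoid and WebKB dataset classes; the load_dataset helper and the root directory are assumptions, not part of the paper.

```python
# Hedged sketch: fetching the six benchmarks named in the paper via
# PyTorch Geometric. The loader choice and paths are assumptions.
from torch_geometric.datasets import Planetoid, WebKB

def load_dataset(name: str, root: str = "data"):
    """Return the PyG dataset holding the single benchmark graph."""
    if name in {"Cora", "CiteSeer", "PubMed"}:     # homophilic citation graphs
        return Planetoid(root=root, name=name)
    if name in {"Cornell", "Texas", "Wisconsin"}:  # heterophilic web graphs
        return WebKB(root=root, name=name)
    raise ValueError(f"unknown dataset: {name}")

dataset = load_dataset("Cora")
data = dataset[0]  # a torch_geometric.data.Data object
print(data.num_nodes, dataset.num_classes)
```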
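
The reported optimizer settings can be wired into a standard transductive training loop, as in the minimal sketch below. The PlaceholderGNN backbone, the hidden size of 64, and the single dropout/weight-decay values picked from the reported search ranges are all assumptions; Disc-GNN's actual architecture and its η/ϵ-weighted regularization term are not public and are not reconstructed here.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class PlaceholderGNN(torch.nn.Module):
    """Stand-in backbone; the actual Disc-GNN architecture is not released."""
    def __init__(self, in_dim: int, hid_dim: int, out_dim: int, dropout: float):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, out_dim)
        self.dropout = dropout

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, p=self.dropout, training=self.training)
        return self.conv2(x, edge_index)

def train(data, num_classes: int, epochs: int = 200):
    # Reported: lr = 0.01; dropout searched in [0.4, 0.6] (0.5 assumed here);
    # weight decay searched in [5e-4, 1e-2] (5e-4 assumed here).
    model = PlaceholderGNN(data.num_features, 64, num_classes, dropout=0.5)
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)
    model.train()
    for _ in range(epochs):
        optimizer.zero_grad()
        out = model(data.x, data.edge_index)
        # The paper's objective also carries a regularization term weighted by
        # eta in [0, 0.5] with relaxation factor epsilon in [-0.5, 0.5]; its
        # exact form is Disc-GNN-specific and is omitted from this sketch.
        # data.train_mask is assumed to be a single boolean mask (as in the
        # Planetoid public split); the paper does not report its splits.
        loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
        loss.backward()
        optimizer.step()
    return model
```

Running train(data, dataset.num_classes) on the Cora object loaded above yields a conventional node-classification pipeline, though not necessarily the authors' exact configuration.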