EGonc: Energy-based Open-Set Node Classification with Substitute Unknowns

Authors: Qin Zhang, Zelin Shi, Shirui Pan, Junyang Chen, Huisi Wu, Xiaojun Chen

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Comprehensive experimental evaluations of EGonc also demonstrate its superiority.
Researcher Affiliation | Academia | Qin Zhang¹, Zelin Shi¹, Shirui Pan², Junyang Chen¹, Huisi Wu¹, Xiaojun Chen¹ (¹Shenzhen University, ²Griffith University)
Pseudocode | Yes | The algorithm of EGonc is illustrated in Algorithm 1.
Open Source Code | Yes | Code are available at https://github.com/hiromisyo/EGonc.
Open Datasets | Yes | Experiments to evaluate the performance for open-set node classification were mainly performed on five benchmark graph datasets [54, 72], namely Cora, Citeseer, DBLP, PubMed, and Ogbn_arxiv [24, 45], which are widely used citation network datasets.
Dataset Splits | Yes | 70% of the known class nodes were sampled for training, 10% for validation and 20% for testing. (A minimal split sketch follows the table.)
Hardware Specification | Yes | All the experiments were conducted on a workstation equipped with an Intel(R) Xeon(R) Gold 6226R CPU and an Nvidia A100.
Software Dependencies | No | EGonc is implemented with PyTorch and the networks are optimized using stochastic gradient descent with a learning rate of 1e−3. Specific version numbers for PyTorch or other libraries are not provided.
Experiment Setup | Yes | The GCN is configured with two hidden GCN layers in the dimension of 512 and 128, followed by an additional multilayer perceptron layer of size 64. EGonc is implemented with PyTorch and the networks are optimized using stochastic gradient descent with a learning rate of 1e−3. The balance parameters λ₁, λ₂ and λ₃ are chosen by a grid search in the interval from 10⁻² to 10² with a step size of 10¹. (A configuration sketch follows the table.)
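The 70/10/20 split of known-class nodes reported in the Dataset Splits row is simple to reproduce. Below is a minimal sketch, not the authors' released code; the helper name split_known_nodes and the known_idx/seed arguments are illustrative assumptions.

```python
# Minimal sketch: split known-class node indices into 70% train / 10% val / 20% test,
# as stated in the "Dataset Splits" row. Names are illustrative, not from the paper.
import torch

def split_known_nodes(known_idx: torch.Tensor, seed: int = 0):
    g = torch.Generator().manual_seed(seed)          # fixed seed for a reproducible split
    perm = known_idx[torch.randperm(known_idx.numel(), generator=g)]
    n = perm.numel()
    n_train = int(0.7 * n)
    n_val = int(0.1 * n)
    train_idx = perm[:n_train]
    val_idx = perm[n_train:n_train + n_val]
    test_idx = perm[n_train + n_val:]                # remaining ~20%
    return train_idx, val_idx, test_idx
```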
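The Experiment Setup row specifies the backbone dimensions, the optimizer, and the grid for the balance parameters. The sketch below assembles those reported settings, assuming PyTorch Geometric's GCNConv for the graph layers (the paper states only "PyTorch"); the class name Backbone, the classification head, and the Cora-like input size are illustrative assumptions rather than the authors' implementation.

```python
# Sketch of the reported configuration: two GCN layers (512, 128), a 64-unit MLP layer,
# SGD with lr = 1e-3, and a multiplicative grid over the balance parameters.
import itertools
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv  # assumption: PyG layers; the paper only says PyTorch

class Backbone(nn.Module):
    def __init__(self, in_dim: int, num_known_classes: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, 512)
        self.conv2 = GCNConv(512, 128)
        self.mlp = nn.Linear(128, 64)
        self.head = nn.Linear(64, num_known_classes)  # illustrative head, not from the paper

    def forward(self, x, edge_index):
        h = torch.relu(self.conv1(x, edge_index))
        h = torch.relu(self.conv2(h, edge_index))
        h = torch.relu(self.mlp(h))
        return self.head(h)

model = Backbone(in_dim=1433, num_known_classes=4)   # Cora-like sizes, purely illustrative
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# Balance parameters: grid from 1e-2 to 1e2 in multiplicative steps of 10.
grid = [10.0 ** p for p in range(-2, 3)]
for lam1, lam2, lam3 in itertools.product(grid, repeat=3):
    pass  # train and validate with this (λ1, λ2, λ3) combination
```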