Diffusion Probabilistic Models for Structured Node Classification

Authors: Hyosoon Jang, Seonghyun Park, Sangwoo Mo, Sungsoo Ahn

NeurIPS 2023

Reproducibility Variable Result LLM Response
Research Type Experimental We extensively verify the superiority of our DPM-SNC in diverse scenarios, which include not only the transductive setting on partially labeled graphs but also the inductive setting and unlabeled graphs.
Researcher Affiliation Academia Hyosoon Jang1, Seonghyun Park1, Sangwoo Mo2, Sungsoo Ahn1 1POSTECH 2University of Michigan {hsjang1205,shpark26,sungsoo.ahn}@postech.ac.kr, swmo@umich.edu
Pseudocode Yes Algorithm 1 DPM-SNC
Open Source Code No The paper does not provide any explicit statement or link for open-source code for the described methodology.
Open Datasets Yes In the transductive setting, we conduct experiments on seven benchmarks: Pubmed, Cora, and Citeseer [22]; Photo and Computer [23]; and Empire and Ratings [24].
Dataset Splits Yes For all datasets, 20 nodes per class are used for training, and the remaining nodes are used for validation and testing. Then, we split 30%, 30%, and 40% of the nodes into training, validation, and test sets.
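The 30%/30%/40% node split quoted above can be sketched as follows. This is a minimal illustration, not the paper's code: the function name, NumPy usage, and fixed seed are all assumptions made here for clarity.

```python
import numpy as np

def split_nodes(num_nodes, train_frac=0.3, val_frac=0.3, seed=0):
    """Randomly partition node indices into train / validation / test sets.

    The remaining fraction (here ~40%) becomes the test set, matching the
    30%/30%/40% split described in the paper.
    """
    rng = np.random.default_rng(seed)
    perm = rng.permutation(num_nodes)          # shuffle all node indices
    n_train = int(train_frac * num_nodes)
    n_val = int(val_frac * num_nodes)
    train = perm[:n_train]
    val = perm[n_train:n_train + n_val]
    test = perm[n_train + n_val:]              # everything left over
    return train, val, test

train, val, test = split_nodes(1000)
```

With 1000 nodes this yields disjoint sets of 300, 300, and 400 indices.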
Hardware Specification Yes For all experiments, we use a single NVIDIA GeForce RTX 3090 GPU.
Software Dependencies No The paper does not provide specific software dependency versions (e.g., library or framework versions like PyTorch 1.9 or Python 3.8).
Experiment Setup Yes We search the learning rate within {1e-3, 5e-3, 1e-2} for all methods. For DPM-SNC, we fix the number of diffusion steps to 100. We also set the buffer size to 50 and insert five samples into the buffer every 30 training steps.
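The buffer schedule in the setup above (a size-50 buffer receiving five samples every 30 training steps) can be sketched as a simple fixed-capacity buffer. The class name, FIFO eviction policy, and loop structure below are illustrative assumptions, not taken from the paper's implementation.

```python
from collections import deque

class SampleBuffer:
    """Fixed-capacity buffer of generated samples (FIFO eviction assumed)."""

    def __init__(self, capacity=50):
        # deque with maxlen silently drops the oldest items once full
        self.buffer = deque(maxlen=capacity)

    def insert(self, samples):
        self.buffer.extend(samples)

    def __len__(self):
        return len(self.buffer)

buf = SampleBuffer(capacity=50)
for step in range(1, 301):
    if step % 30 == 0:            # every 30 training steps...
        buf.insert(range(5))      # ...insert five (placeholder) samples
```

After 300 steps there have been ten insertions of five samples each, exactly filling the size-50 buffer; further insertions would evict the oldest entries.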