Adaptive Diffusion in Graph Neural Networks

Authors: Jialin Zhao, Yuxiao Dong, Ming Ding, Evgeny Kharlamov, Jie Tang

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | By directly plugging ADC into existing GNNs, we observe consistent and significant outperformance over both GDC and their vanilla versions across various datasets, demonstrating the improved model capacity brought by automatically learning unique neighborhood size per layer and per channel in GNNs. (See the diffusion sketch after this table.)
Researcher Affiliation | Collaboration | Jialin Zhao (Tsinghua University, zjl19970607@gmail.com); Yuxiao Dong (Microsoft Research, ericdongyx@gmail.com); Ming Ding (Tsinghua University, dm18@mails.tsinghua.edu.cn); Evgeny Kharlamov (Bosch Center for Artificial Intelligence, Evgeny.Kharlamov@de.bosch.com); Jie Tang (Tsinghua University, jietang@tsinghua.edu.cn)
Pseudocode | No | The paper includes mathematical equations and descriptions of the approach but no explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | Code is available at https://github.com/abcbdf/ADC
Open Datasets | Yes | We use widely-adopted datasets including CORA, CiteSeer [28], PubMed [25], Coauthor CS, Amazon Computers and Amazon Photo [29]. (See the loading sketch after this table.)
Dataset Splits | Yes | The data is split into a development and test set. ... The development set is split into a training set containing 20 nodes for each class and a validation set with the remaining nodes.
Hardware Specification | No | The paper's checklist states 'Did you include the total amount of compute and the type of resources used (e.g., type of GPUs, internal cluster, or cloud provider)? [Yes] See appendix.' but the specific hardware details are not provided in the main text.
Software Dependencies | No | The paper does not provide specific version numbers for ancillary software dependencies.
Experiment Setup | Yes | We set the learning rate of t equal to the learning rate of other parameters, which is 0.01. ... The expansion step (K in Eq. 15) is set to 10. We use early stopping with a patience of 100 epochs. (See the training-loop sketch after this table.)
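
The Research Type row above points to ADC's core mechanism: a graph diffusion whose effective neighborhood size is learned per layer and per channel, with larger diffusion time t meaning a wider neighborhood. Below is a minimal sketch of that idea, assuming a heat-kernel parameterization with one learnable t per feature channel and a Taylor expansion truncated at K = 10 as reported; the class name, the t_init value, and the dense transition matrix T are assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class HeatKernelDiffusion(nn.Module):
    """Sketch of ADC-style feature diffusion with a learnable heat-kernel
    diffusion time t per feature channel (hypothetical class; the dense
    transition matrix T and t_init are assumptions)."""

    def __init__(self, num_channels: int, K: int = 10, t_init: float = 5.0):
        super().__init__()
        # One diffusion time per channel; ADC trains these jointly with
        # the other model parameters by gradient descent.
        self.t = nn.Parameter(torch.full((num_channels,), t_init))
        self.K = K  # truncation order of the heat-kernel expansion

    def forward(self, T: torch.Tensor, X: torch.Tensor) -> torch.Tensor:
        # Heat kernel: H = sum_{k=0}^{K} e^{-t} t^k / k! * T^k X,
        # computed incrementally and truncated at K terms.
        theta = torch.exp(-self.t)   # k = 0 coefficient e^{-t}, shape (C,)
        out = theta * X              # k = 0 term; theta broadcasts over rows
        Tk_X = X
        for k in range(1, self.K + 1):
            Tk_X = T @ Tk_X              # T^k X, shape (N, C)
            theta = theta * self.t / k   # update to e^{-t} t^k / k!
            out = out + theta * Tk_X
        return out
```

Usage would look like `H = HeatKernelDiffusion(num_channels=X.size(1))(T, X)`, with T being, for example, a symmetrically normalized adjacency matrix; because each channel carries its own t, gradient descent can assign each channel its own neighborhood radius.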
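For the Open Datasets and Dataset Splits rows, here is a sketch of how the listed datasets and the 20-nodes-per-class development split could be assembled. The loaders are real PyTorch Geometric classes, but the paper does not state that it uses PyTorch Geometric, and the split_development helper, its seed, and the dev_mask argument are hypothetical.

```python
import torch
from torch_geometric.datasets import Planetoid, Coauthor, Amazon

# Citation networks load through Planetoid; the co-author and
# co-purchase graphs have their own loaders.
cora = Planetoid(root="data/Planetoid", name="Cora")
citeseer = Planetoid(root="data/Planetoid", name="CiteSeer")
pubmed = Planetoid(root="data/Planetoid", name="PubMed")
coauthor_cs = Coauthor(root="data/Coauthor", name="CS")
computers = Amazon(root="data/Amazon", name="Computers")
photo = Amazon(root="data/Amazon", name="Photo")

def split_development(data, dev_mask, num_per_class: int = 20, seed: int = 0):
    """Split a development mask into a training mask with num_per_class
    nodes per class and a validation mask holding the remaining
    development nodes (hypothetical helper; the paper does not specify
    the exact sampling procedure)."""
    g = torch.Generator().manual_seed(seed)
    train_mask = torch.zeros(data.num_nodes, dtype=torch.bool)
    for c in range(int(data.y.max()) + 1):
        idx = ((data.y == c) & dev_mask).nonzero(as_tuple=True)[0]
        perm = idx[torch.randperm(idx.numel(), generator=g)]
        train_mask[perm[:num_per_class]] = True
    val_mask = dev_mask & ~train_mask
    return train_mask, val_mask

# e.g.: train_mask, val_mask = split_development(cora[0], dev_mask)
# where dev_mask marks the development (non-test) nodes.
```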
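The Experiment Setup row reports a single learning rate of 0.01 shared by t and the other parameters, K = 10, and early stopping with a patience of 100 epochs. A minimal training-loop sketch under those settings follows; the model call signature and the use of Adam with cross-entropy loss are assumptions rather than details confirmed by the quoted text.

```python
import torch
import torch.nn.functional as F

def train_with_early_stopping(model, data, lr: float = 0.01, patience: int = 100):
    """Sketch of the reported setup: one optimizer at lr 0.01 covering the
    diffusion times t and all other parameters, with early stopping once
    validation loss fails to improve for `patience` epochs. Adam and
    cross-entropy are assumptions."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    best_val, bad_epochs = float("inf"), 0
    while True:
        model.train()
        optimizer.zero_grad()
        out = model(data.x, data.edge_index)
        loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
        loss.backward()
        optimizer.step()

        model.eval()
        with torch.no_grad():
            out = model(data.x, data.edge_index)
            val = F.cross_entropy(out[data.val_mask], data.y[data.val_mask]).item()
        if val < best_val:
            best_val, bad_epochs = val, 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:  # 100 epochs without improvement
                break
```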