Graph Neural Convection-Diffusion with Heterophily

Authors: Kai Zhao, Qiyu Kang, Yang Song, Rui She, Sijie Wang, Wee Peng Tay

IJCAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We conduct extensive experiments, which suggest that our framework can achieve competitive performance on node classification tasks for heterophilic graphs, compared to the state-of-the-art methods.
Researcher Affiliation | Collaboration | Kai Zhao¹, Qiyu Kang¹, Yang Song², Rui She¹, Sijie Wang¹, and Wee Peng Tay¹ (¹Nanyang Technological University, ²C3.AI)
Pseudocode | Yes | Algorithm 1: Neural CDE Inference (a minimal sketch follows after this table)
Open Source Code | Yes | The code is available at https://github.com/zknus/Graph-Diffusion-CDE.
Open Datasets | Yes | The paper [Pei et al., 2019] evaluates the performance of their model on six heterophilic graph datasets: Squirrel, Chameleon, Actor, Texas, Cornell, and Wisconsin. ... The datasets Cornell, Texas, and Wisconsin (available at http://www.cs.cmu.edu/afs/cs.cmu.edu/project/theo11/www/wwkb) do not have this data leakage issue but are relatively small and have significantly imbalanced classes. ... We additionally include six new heterophilic datasets proposed in [Platonov et al., 2023]. (A loading sketch follows after this table.)
Dataset Splits | Yes | For these heterophilic datasets, we follow the data splitting in [Platonov et al., 2023], which is 50%, 25%, and 25% for training, validation, and testing. (A split sketch follows after this table.)
Hardware Specification | Yes | OOM refers to out-of-memory on an NVIDIA RTX A5000 GPU.
Software Dependencies | No | The paper mentions using the Adam optimizer [Kingma and Ba, 2014] and solving the neural PDE through the methods of [Chen et al., 2018], but it does not provide specific version numbers for software dependencies such as Python, PyTorch, or CUDA.
Experiment Setup | Yes | For all these datasets, we use the Adam optimizer [Kingma and Ba, 2014] with a learning rate of 0.01 and weight decay of 0.001. We also apply a dropout rate of 0.2 to prevent overfitting. (A setup sketch follows after this table.)
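
The Pseudocode row points to Algorithm 1 (Neural CDE Inference), which integrates a convection-diffusion equation over the graph to produce node representations. Below is a minimal Python sketch of that idea using torchdiffeq, the ODE-solver library from [Chen et al., 2018] that the paper builds on. The specific update rule (Laplacian-style diffusion plus a learned convection term routed through the adjacency) and the names `ConvectionDiffusionFunc` and `cde_inference` are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of convection-diffusion dynamics on a graph, solved with torchdiffeq.
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq


class ConvectionDiffusionFunc(nn.Module):
    def __init__(self, adj_norm: torch.Tensor, dim: int):
        super().__init__()
        self.adj_norm = adj_norm             # row-normalized adjacency, shape (N, N)
        self.velocity = nn.Linear(dim, dim)  # hypothetical learned convection field

    def forward(self, t, x):
        diffusion = self.adj_norm @ x - x              # Laplacian-style feature smoothing
        convection = self.adj_norm @ self.velocity(x)  # directed transport of features
        return diffusion + convection


def cde_inference(x0, adj_norm, t_end=1.0):
    """Integrate node features from t=0 to t=t_end and return the final state."""
    func = ConvectionDiffusionFunc(adj_norm, x0.size(-1))
    t = torch.linspace(0.0, t_end, 2)
    xt = odeint(func, x0, t, method="dopri5")  # adaptive Runge-Kutta solver
    return xt[-1]
```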
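The Open Datasets row lists Squirrel, Chameleon, Actor, Texas, Cornell, and Wisconsin, plus the newer heterophilic benchmarks from [Platonov et al., 2023]. All of these ship with PyTorch Geometric, so a loading sketch like the following is one plausible setup; the `root="data"` path is an assumption, and the paper's repository may package the data differently.

```python
# Loading the heterophilic benchmarks via PyTorch Geometric's built-in loaders.
from torch_geometric.datasets import (
    WebKB, WikipediaNetwork, Actor, HeterophilousGraphDataset,
)

cornell = WebKB(root="data", name="Cornell")                 # also "Texas", "Wisconsin"
chameleon = WikipediaNetwork(root="data", name="chameleon")  # also "squirrel"
actor = Actor(root="data")
roman = HeterophilousGraphDataset(root="data", name="Roman-empire")
```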
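The Dataset Splits row quotes a 50%/25%/25% train/validation/test split following [Platonov et al., 2023]. A minimal sketch of generating such a node split is below; note that [Platonov et al., 2023] release fixed splits per dataset, whereas this draws a fresh random permutation, and the `random_split` helper and seed are hypothetical.

```python
# 50/25/25 train/val/test node split from a seeded random permutation.
import torch

def random_split(num_nodes: int, seed: int = 0):
    g = torch.Generator().manual_seed(seed)
    perm = torch.randperm(num_nodes, generator=g)
    n_train = int(0.50 * num_nodes)
    n_val = int(0.25 * num_nodes)
    train_idx = perm[:n_train]
    val_idx = perm[n_train:n_train + n_val]
    test_idx = perm[n_train + n_val:]
    return train_idx, val_idx, test_idx
```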
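Finally, the Experiment Setup row pins down the optimizer hyperparameters: Adam with learning rate 0.01, weight decay 0.001, and a dropout rate of 0.2. The sketch below wires those quoted values into a training step; the surrounding model (layer sizes, number of classes) and the `train_step` helper are illustrative assumptions.

```python
# Quoted hyperparameters: Adam, lr=0.01, weight_decay=0.001, dropout=0.2.
import torch
import torch.nn.functional as F

model = torch.nn.Sequential(
    torch.nn.Dropout(p=0.2),    # dropout rate quoted in the paper
    torch.nn.Linear(1433, 64),  # hypothetical feature/hidden sizes
    torch.nn.ReLU(),
    torch.nn.Linear(64, 5),     # hypothetical number of classes
)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=0.001)

def train_step(x, y, train_idx):
    model.train()
    optimizer.zero_grad()
    logits = model(x)
    loss = F.cross_entropy(logits[train_idx], y[train_idx])
    loss.backward()
    optimizer.step()
    return loss.item()
```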