Beyond Homophily: Reconstructing Structure for Graph-agnostic Clustering

Authors: Erlin Pan, Zhao Kang

ICML 2023

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | Extensive experiments on 11 benchmark graphs demonstrate our promising performance. |
| Researcher Affiliation | Academia | School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China. Correspondence to: Erlin Pan <wujisixsix6@gmail.com>, Zhao Kang <zkang@uestc.edu.cn>. |
| Pseudocode | No | The paper describes mathematical formulations and procedures but does not include a clearly labeled 'Algorithm' or 'Pseudocode' block. |
| Open Source Code | Yes | The code is available at DGCN. |
| Open Datasets | Yes | To evaluate the effectiveness of the proposed method, we conduct extensive experiments on 11 benchmarks, including homophilic graph datasets, like Cora, Citeseer (Kipf & Welling), ACM (Fan et al., 2020), AMAP (Liu et al., 2022b), EAT (Mrabah et al., 2022); heterophilic graph datasets, like Texas, Cornell, Wisconsin, Washington (Pei et al., 2020), Twitch (Lim et al., 2021b), and Squirrel (Rozemberczki et al., 2021). |
| Dataset Splits | No | The paper does not specify exact percentages, sample counts, or predefined citations for train/validation/test dataset splits needed for reproduction. It focuses on unsupervised clustering metrics. |
| Hardware Specification | No | The paper does not provide any specific hardware details such as GPU models, CPU types, or memory used for running the experiments. |
| Software Dependencies | No | The paper mentions the use of 'Adam optimizer' but does not specify any software libraries or their version numbers (e.g., Python, PyTorch, TensorFlow, scikit-learn versions). |
| Experiment Setup | Yes | Our network is trained with the Adam optimizer for 500 epochs until convergence. The learning rate of the optimizer is set to 1e-2. We tune the filter order k in [1, 2, 3, 4, 5, 10]. |
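
The benchmarks listed in the Open Datasets row are standard public graphs, though the paper does not state which loaders it used. A minimal sketch, assuming PyTorch Geometric as the data source; ACM, EAT, and Washington have no standard PyG loader and are omitted here:

```python
# Hedged sketch: fetch several of the cited benchmarks with PyTorch Geometric.
# The paper does not confirm these loaders; this only shows how the public
# versions of the graphs can be obtained.
from torch_geometric.datasets import Planetoid, WebKB, WikipediaNetwork

cora = Planetoid(root="data/Planetoid", name="Cora")          # homophilic
citeseer = Planetoid(root="data/Planetoid", name="CiteSeer")  # homophilic
texas = WebKB(root="data/WebKB", name="Texas")                # heterophilic
squirrel = WikipediaNetwork(root="data/Wikipedia", name="squirrel")

data = cora[0]  # a single Data object with x, edge_index, y
print(data.num_nodes, data.num_edges, cora.num_classes)
```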
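
The Experiment Setup row pins down the optimizer, epoch budget, learning rate, and the sweep over the filter order k. Below is a minimal sketch of that loop, assuming PyTorch; `KOrderFilter` and the reconstruction-style loss are hypothetical stand-ins, not the paper's DGCN filters or clustering objective:

```python
# Hedged sketch of the stated setup: Adam, lr = 1e-2, 500 epochs, and a sweep
# over the filter order k in [1, 2, 3, 4, 5, 10].
import torch
import torch.nn as nn

class KOrderFilter(nn.Module):
    """Propagate features k times with a normalized adjacency, then project.
    A hypothetical stand-in for the paper's graph filters."""
    def __init__(self, in_dim, out_dim, k):
        super().__init__()
        self.k = k
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        for _ in range(self.k):
            x = adj_norm @ x
        return self.lin(x)

def train(model, x, adj_norm, epochs=500, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        z = model(x, adj_norm)
        # Placeholder adjacency-reconstruction loss (hypothetical, not the
        # paper's unsupervised clustering objective).
        loss = ((z @ z.t()).sigmoid() - adj_norm).pow(2).mean()
        loss.backward()
        opt.step()
    return model

# Toy symmetric graph with self-loops and symmetric normalization.
n, d = 100, 16
x = torch.randn(n, d)
a = (torch.rand(n, n) < 0.05).float()
a = ((a + a.t()) > 0).float()
a.fill_diagonal_(1.0)
d_inv_sqrt = a.sum(1).pow(-0.5)
adj_norm = d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]

for k in [1, 2, 3, 4, 5, 10]:  # the sweep reported in the paper
    train(KOrderFilter(d, 8, k), x, adj_norm)
```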