Attributed Graph Clustering: A Deep Attentional Embedding Approach

Authors: Chun Wang, Shirui Pan, Ruiqi Hu, Guodong Long, Jing Jiang, Chengqi Zhang

IJCAI 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimental results compared with state-of-the-art algorithms demonstrate the superiority of our method." "The experimental results show that our algorithm outperforms state-of-the-art graph clustering methods."
Researcher Affiliation | Academia | (1) Centre for Artificial Intelligence, University of Technology Sydney, Australia; (2) Faculty of IT, Monash University, Australia. {chun.wang-1, ruiqi.hu}@student.uts.edu.au, shirui.pan@monash.edu, {guodong.long, jing.jiang, chengqi.zhang}@uts.edu.au
Pseudocode | Yes | Algorithm 1: Deep Attentional Embedded Graph Clustering (a hedged training-loop sketch in this style follows the table).
Open Source Code | No | The paper does not provide any specific links or statements about the availability of its source code.
Open Datasets | Yes | "We used three standard citation networks widely-used for assessment of attributed graph analysis in our experiments, summarized in Table 1. Publications in the datasets are categorized by the research sub-fields." Reported statistics: Cora: 2,708 nodes, 1,433 features, 7 clusters, 5,429 edges, 3,880,564; Citeseer: 3,327 nodes, 3,703 features, 6 clusters, 4,732 edges, 12,274,336; Pubmed: 19,717 nodes, 500 features, 3 clusters, 44,338 edges, 9,858,500. (A hedged data-loading sketch follows the table.)
Dataset Splits | No | The paper mentions using benchmark datasets (Cora, Citeseer, Pubmed) but does not specify the train/validation/test splits used for these datasets within the text.
Hardware Specification | No | The paper does not provide specific details about the hardware used for running the experiments (e.g., GPU models, CPU types, memory).
Software Dependencies | No | The paper does not specify the version numbers for any software dependencies or libraries used in the experiments.
Experiment Setup | Yes | "For our method, we set the clustering coefficient γ to 10. We consider second-order neighbors and set M = (B + B²)/2. The encoder is constructed with a 256-neuron hidden layer and a 16-neuron embedding layer for all datasets." (A hedged sketch of this setup follows the table.)
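For context on the Pseudocode row: the sketch below is a minimal, hypothetical PyTorch rendering of the self-training objective a DAEGC-style method optimises, combining adjacency reconstruction with a γ-weighted KL term between a soft assignment Q and a sharpened target P. It is not the authors' Algorithm 1 or released code; the function names, hyperparameters other than γ = 10, and the random smoke-test data are all illustrative.

```python
import torch
import torch.nn.functional as F

def soft_assignment(z, mu, alpha=1.0):
    # Student's t-distribution soft assignment Q of node embeddings z to cluster centers mu.
    dist = torch.cdist(z, mu) ** 2
    q = (1.0 + dist / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(dim=1, keepdim=True)

def target_distribution(q):
    # Sharpened auxiliary distribution P used as the self-training target.
    weight = q ** 2 / q.sum(dim=0)
    return weight / weight.sum(dim=1, keepdim=True)

def joint_loss(z, adj, mu, gamma=10.0):
    # Adjacency reconstruction plus gamma-weighted clustering (KL) loss; gamma = 10 as in the setup row.
    a_hat = torch.sigmoid(z @ z.t())              # inner-product decoder
    recon = F.binary_cross_entropy(a_hat, adj)    # graph reconstruction term
    q = soft_assignment(z, mu)
    p = target_distribution(q).detach()           # target held fixed within a step
    clu = F.kl_div(q.log(), p, reduction="batchmean")
    return recon + gamma * clu

# Tiny smoke test with random 16-dimensional embeddings and a random symmetric adjacency.
z = torch.randn(8, 16)
adj = (torch.rand(8, 8) > 0.5).float()
adj = ((adj + adj.t()) > 0).float()
mu = torch.randn(3, 16)                           # 3 cluster centers, purely illustrative
print(joint_loss(z, adj, mu).item())
```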
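For the Open Datasets row: the paper does not state how the citation networks were obtained or preprocessed. One convenient option (an assumption, not something the paper specifies) is PyTorch Geometric's Planetoid loader:

```python
# Downloads Cora, Citeseer, and Pubmed into ./data and prints basic statistics.
from torch_geometric.datasets import Planetoid

for name in ["Cora", "CiteSeer", "PubMed"]:
    dataset = Planetoid(root=f"data/{name}", name=name)
    data = dataset[0]  # a single graph per dataset
    print(name, data.num_nodes, data.num_edges, dataset.num_features, dataset.num_classes)
```

Note that PyTorch Geometric stores each undirected edge in both directions, so num_edges will be roughly twice the link counts quoted in the table above.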
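For the Experiment Setup row: the quoted configuration fixes γ = 10, builds M = (B + B²)/2, and uses a 256-unit hidden layer with a 16-dimensional embedding. Below is a minimal sketch, assuming B is the row-normalised transition matrix of the adjacency matrix (B_ij = 1/deg(i) for each edge (i, j)); the paper's exact preprocessing, e.g. handling of self-loops, may differ.

```python
import numpy as np

def proximity_matrix(adj):
    # M = (B + B^2) / 2, with B assumed to be the row-normalised transition matrix.
    deg = adj.sum(axis=1, keepdims=True)
    b = adj / np.maximum(deg, 1.0)     # guard against isolated (degree-0) nodes
    return (b + b @ b) / 2.0

# Encoder sizes stated in the paper: input features -> 256 hidden -> 16-dim embedding.
encoder_dims = [1433, 256, 16]         # 1433 is Cora's feature dimension

# Small worked example on a 3-node path graph.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
print(proximity_matrix(adj))
```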