Learning Deep Representations for Graph Clustering

Authors: Fei Tian, Bin Gao, Qing Cui, Enhong Chen, Tie-Yan Liu

AAAI 2014 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
--- | --- | ---
Research Type | Experimental | The experimental results on various graph datasets show that the proposed method significantly outperforms conventional spectral clustering, which clearly indicates the effectiveness of deep learning in graph clustering.
Researcher Affiliation | Collaboration | Fei Tian, University of Science and Technology of China (tianfei@mail.ustc.edu.cn); Bin Gao, Microsoft Research (bingao@microsoft.com); Qing Cui, Tsinghua University (cuiqing1989@gmail.com); Enhong Chen, University of Science and Technology of China (cheneh@ustc.edu.cn); Tie-Yan Liu, Microsoft Research (tyliu@microsoft.com)
Pseudocode | Yes | Table 1: Clustering with Graph Encoder (see the pipeline sketch after this table)
Open Source Code | No | No statement regarding the release or availability of open-source code for the methodology was found.
Open Datasets | Yes | Wine. This is a dataset from the UCI Machine Learning Repository (Asuncion and Newman 2007)... DIP. This is an unweighted protein-protein interaction (PPI) network from the Database of Interacting Proteins (Salwinski et al. 2004)... BioGRID. The last dataset is another PPI network obtained from the BioGRID database (Stark et al. 2011).
Dataset Splits | No | No explicit details on train/validation/test dataset splits (e.g., percentages, sample counts, or specific split methodologies) were found.
Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory, or specific computing infrastructure) used for running experiments were mentioned.
Software Dependencies | No | No specific software dependencies with version numbers (e.g., programming languages, libraries, or frameworks with their versions) were provided.
Experiment Setup | Yes | Table 3: Neural Network Structures (which lists specific layer dimensions for each dataset). In our experiments, we tuned two parameters of SAE: sparsity penalty... and sparsity target... (see the sparsity-penalty note after this table)
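The pseudocode row above refers to the paper's Table 1, which describes clustering a graph by training a stacked sparse autoencoder on the normalized similarity matrix and then running k-means on the deepest-layer representation. Since no official code is available (see the Open Source Code row), the sketch below is only a minimal illustration of that pipeline under assumed choices: the layer sizes, training settings, and the use of PyTorch and scikit-learn are placeholders, not the authors' implementation, and the tuned sparsity penalty is omitted here (see the next snippet).

```python
# Hypothetical sketch of the pipeline in the paper's Table 1: row-normalize the
# graph similarity matrix, train autoencoder layers greedily on it, then run
# k-means on the deepest representation. All dimensions/settings are placeholders.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans


def normalized_similarity(adj: np.ndarray) -> np.ndarray:
    """Row-normalize the similarity/adjacency matrix, i.e. X = D^{-1} S."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # guard against isolated nodes
    return adj / deg


def train_autoencoder_layer(x: torch.Tensor, hidden_dim: int,
                            epochs: int = 200, lr: float = 1e-3) -> torch.Tensor:
    """Train one autoencoder layer on x and return its hidden codes."""
    in_dim = x.shape[1]
    encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.Sigmoid())
    decoder = nn.Sequential(nn.Linear(hidden_dim, in_dim), nn.Sigmoid())
    optim = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    for _ in range(epochs):
        recon = decoder(encoder(x))
        loss = nn.functional.mse_loss(recon, x)  # the sparsity term would be added here
        optim.zero_grad()
        loss.backward()
        optim.step()
    return encoder(x).detach()


def cluster_graph(adj: np.ndarray, layer_dims, n_clusters: int) -> np.ndarray:
    """Stack the layers greedily, then cluster the deepest embedding with k-means."""
    x = torch.tensor(normalized_similarity(adj), dtype=torch.float32)
    for dim in layer_dims:
        x = train_autoencoder_layer(x, dim)
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(x.numpy())


if __name__ == "__main__":
    # Toy graph: two disconnected triangles should fall into two clusters.
    A = np.array([[0, 1, 1, 0, 0, 0],
                  [1, 0, 1, 0, 0, 0],
                  [1, 1, 0, 0, 0, 0],
                  [0, 0, 0, 0, 1, 1],
                  [0, 0, 0, 1, 0, 1],
                  [0, 0, 0, 1, 1, 0]], dtype=float)
    print(cluster_graph(A, layer_dims=[4, 2], n_clusters=2))
```

Greedy layer-wise training follows the stacked-autoencoder convention the paper builds on: each layer is trained on the previous layer's codes rather than end to end, and only the final codes are passed to k-means.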
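The experiment-setup row notes that the sparsity penalty and sparsity target of the SAE were tuned, but the excerpt elides the values. For readers unfamiliar with these terms, the snippet below shows the usual KL-divergence form such a penalty takes; rho (the sparsity target), beta (the penalty weight), and their default values here are illustrative assumptions rather than the paper's tuned settings, and the term would simply be added to each layer's reconstruction loss in the sketch above.

```python
# Standard KL-divergence sparsity penalty for a sparse autoencoder layer
# (assumed form; the paper's exact formulation and tuned values are not given here).
import torch


def sparsity_penalty(hidden: torch.Tensor, rho: float = 0.05, beta: float = 0.1) -> torch.Tensor:
    """KL(rho || rho_hat) summed over hidden units and scaled by beta.

    hidden: sigmoid activations for one layer, shape (n_samples, n_hidden).
    rho:    sparsity target, the desired average activation of each hidden unit.
    beta:   weight of the sparsity penalty in the layer's total loss.
    """
    rho_hat = hidden.mean(dim=0).clamp(1e-6, 1 - 1e-6)  # average activation per hidden unit
    kl = rho * torch.log(rho / rho_hat) + (1 - rho) * torch.log((1 - rho) / (1 - rho_hat))
    return beta * kl.sum()


if __name__ == "__main__":
    # Example: add the penalty to a layer's reconstruction loss.
    h = torch.sigmoid(torch.randn(8, 4))  # stand-in for encoder activations
    print(sparsity_penalty(h))
```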