Galaxy Network Embedding: A Hierarchical Community Structure Preserving Approach
Authors: Lun Du, Zhicong Lu, Yun Wang, Guojie Song, Yiming Wang, Wei Chen
IJCAI 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments reveal that the representations from GNE preserve the hierarchical community structure and show advantages in several applications such as vertex multi-class classification and network visualization. The source code of GNE is available online. ... We conduct extensive experiments on three real-world networks and four synthetic networks with explicit hierarchical structures, and the results demonstrate that our model can integrally preserve the hierarchical community structure and is significantly superior to other models on vertex classification and network visualization. |
| Researcher Affiliation | Collaboration | Lun Du¹, Zhicong Lu¹, Yun Wang¹, Guojie Song¹, Yiming Wang¹, Wei Chen². ¹Peking University; ²Microsoft Research. Emails: {dulun, phyluzhicong, wangyun94, gjsong, wangyiming17}@pku.edu.cn; weic@microsoft.com |
| Pseudocode | Yes | Algorithm 1 The GNE algorithm |
| Open Source Code | Yes | The source code of GNE is available online. |
| Open Datasets | Yes | We employ three real datasets from the Facebook social networks dataset, which comprises 100 colleges and universities in the US [Traud et al., 2012]. ... Four Hierarchical Random Graphs (HRGs) with explicit hierarchical community structure are generated following [Clauset et al., 2008]. |
| Dataset Splits | Yes | Different percentages of nodes are sampled randomly for evaluation, and the rest are used for training. The results are averaged over 10 different runs. (A sketch of this protocol appears after the table.) |
| Hardware Specification | No | The paper states, "The optimization algorithm is implemented on the Tensorflow platform, which can be accelerated with GPU," but does not specify any particular GPU model, CPU, memory, or other hardware details. |
| Software Dependencies | No | The paper mentions using the "Tensorflow platform", the "Adam optimizer", "Logistic Regression with sklearn package", and the "t-SNE package", but does not provide version numbers for any of these software dependencies. (A t-SNE visualization sketch appears after the table.) |
| Experiment Setup | Yes | The hyper-parameters of GNE are θ = (µ, γ). GNE is not very sensitive to these hyper-parameters; settings found by grid search over a small range achieve ideal results. In our experiments, the embedding size m of all models is 64. The parameter settings of the comparison models follow the recommended settings in the relevant code packages. (A grid-search sketch appears after the table.) |
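
The Dataset Splits row describes a random split-and-evaluate protocol, and the Software Dependencies row names Logistic Regression from the sklearn package as the downstream classifier. Below is a minimal sketch of that protocol, assuming the GNE embeddings and node labels are already available as NumPy arrays; the function name, the micro-F1 metric, and the `eval_ratio` default are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

def evaluate_embeddings(embeddings, labels, eval_ratio=0.5, n_runs=10, seed=0):
    """Sample a percentage of nodes for evaluation, train Logistic Regression
    on the remaining nodes, and average the score over n_runs random splits."""
    rng = np.random.default_rng(seed)
    n = len(labels)
    scores = []
    for _ in range(n_runs):
        perm = rng.permutation(n)
        n_eval = int(eval_ratio * n)
        eval_idx, train_idx = perm[:n_eval], perm[n_eval:]
        clf = LogisticRegression(max_iter=1000)
        clf.fit(embeddings[train_idx], labels[train_idx])
        pred = clf.predict(embeddings[eval_idx])
        scores.append(f1_score(labels[eval_idx], pred, average="micro"))
    return float(np.mean(scores))
```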
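
The Experiment Setup row states that θ = (µ, γ) is tuned by grid search over a small range with embedding size m = 64. A minimal sketch of such a search follows; the candidate values and the `train_gne` / `score_fn` callables are hypothetical placeholders (the paper does not publish the exact grid), and `score_fn` could be the evaluation function sketched above.

```python
from itertools import product

# Hypothetical candidate values; the paper only says a small range is searched.
MU_GRID = (0.5, 1.0, 2.0)
GAMMA_GRID = (0.1, 0.5, 1.0)

def grid_search_theta(train_gne, score_fn, graph, labels, embedding_size=64):
    """Return the (mu, gamma) pair whose embeddings maximize score_fn."""
    best_theta, best_score = None, float("-inf")
    for mu, gamma in product(MU_GRID, GAMMA_GRID):
        emb = train_gne(graph, m=embedding_size, mu=mu, gamma=gamma)  # hypothetical trainer
        score = score_fn(emb, labels)
        if score > best_score:
            best_theta, best_score = (mu, gamma), score
    return best_theta, best_score
```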
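
The Software Dependencies row also mentions a t-SNE package for network visualization. A minimal sketch using scikit-learn's TSNE and matplotlib is below; the figure size, color map, and output path are illustrative assumptions, and integer community labels are assumed for coloring.

```python
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

def visualize_embeddings(embeddings, labels, out_path="gne_tsne.png"):
    """Project the embeddings to 2-D with t-SNE and color nodes by label."""
    coords = TSNE(n_components=2, init="pca", random_state=0).fit_transform(embeddings)
    plt.figure(figsize=(6, 6))
    plt.scatter(coords[:, 0], coords[:, 1], c=labels, s=5, cmap="tab10")
    plt.savefig(out_path, dpi=200)
    plt.close()
```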