DGE: Deep Generative Network Embedding Based on Commonality and Individuality
Authors: Sheng Zhou, Xin Wang, Jiajun Bu, Martin Ester, Pinggang Yu, Jiawei Chen, Qihao Shi, Can Wang
AAAI 2020, pp. 6949-6956
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on four real-world datasets show the superiority of our proposed DGE framework in various tasks including node classification and link prediction. |
| Researcher Affiliation | Collaboration | (1) Zhejiang Provincial Key Laboratory of Service Robot, College of Computer Science, Zhejiang University, Hangzhou, China; (2) Alibaba-Zhejiang University Joint Institute of Frontier Technologies, Hangzhou, China; (3) Ningbo Research Institute, Zhejiang University, Ningbo 315100, China; (4) Tsinghua University, Beijing, China; (5) Simon Fraser University, Canada |
| Pseudocode | Yes | Algorithm 1 Framework of deep generative network embedding. |
| Open Source Code | Yes | The source code and detailed settings of DGE model can be found in https://github.com/zhoushengisnoob/DGE |
| Open Datasets | Yes | We conduct experiments on three paper citation networks and one social network with different scales of nodes. Table 1 illustrates the details of the datasets used in our experiments. Citation datasets: We select three bibliographic network datasets, namely Citeseer, Cora and Pubmed... Social network: BlogCatalog is a social network... |
| Dataset Splits | No | The paper specifies a 30% training split with the remainder used for testing ('randomly sample 30% labeled nodes to train a SVM classifier and the rest of the nodes are used to test the model'), but it does not explicitly mention a separate validation split; a sketch of this protocol appears after the table. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for experiments, such as GPU models, CPU types, or memory specifications. |
| Software Dependencies | No | The paper does not list specific software dependencies with version numbers (e.g., 'Python 3.x', 'PyTorch 1.x'). It mentions general software or baselines without versioning. |
| Experiment Setup | No | The paper discusses tuning the embedding dimension ('We vary the dimension from 2 to 256') and describes the SVM classification setup, but it defers detailed settings for the main DGE model (such as learning rates, batch sizes, or optimizer details) to the released code ('the detailed setting can be found in our open-sourced code'). |
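
As a concrete reading of the evaluation protocol quoted in the Dataset Splits and Experiment Setup rows, here is a minimal sketch of the 30%-train / 70%-test SVM node classification, assuming scikit-learn. The arrays `embeddings` and `labels`, the sizes, and the choice of `LinearSVC` are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of the node-classification protocol quoted above:
# 30% of labeled nodes train an SVM, the remaining 70% form the test set.
# `embeddings` (n_nodes x dim) and `labels` are hypothetical stand-ins
# for the learned DGE embeddings and the node class labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
n_nodes, dim = 1000, 128                      # illustrative sizes only
embeddings = rng.normal(size=(n_nodes, dim))  # stand-in for DGE output
labels = rng.integers(0, 7, size=n_nodes)     # stand-in for node classes

# Randomly sample 30% of labeled nodes for training; test on the rest.
X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, train_size=0.30, random_state=0, stratify=labels)

clf = LinearSVC().fit(X_train, y_train)
pred = clf.predict(X_test)
print("Micro-F1:", f1_score(y_test, pred, average="micro"))
print("Macro-F1:", f1_score(y_test, pred, average="macro"))
```

The dimension study mentioned in the Experiment Setup row ('We vary the dimension from 2 to 256') could be reproduced by wrapping this snippet in a loop over `dim` values such as 2, 4, ..., 256, regenerating or re-learning the embeddings at each size.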