Adversarial Directed Graph Embedding

Authors: Shijie Zhu, Jianxin Li, Hao Peng, Senzhang Wang, Lifang He (pp. 4741–4748)

AAAI 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Extensive experiments show that DGGAN consistently and significantly outperforms existing state-of-the-art methods across multiple graph mining tasks on directed graphs."
Researcher Affiliation | Academia | (1) Beijing Advanced Innovation Center for Big Data and Brain Computing, Beihang University, Beijing 100191, China; (2) State Key Laboratory of Software Development Environment, Beihang University, Beijing 100191, China; (3) College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China; (4) Department of Computer Science and Engineering, Lehigh University, Bethlehem, PA, USA
Pseudocode | Yes | Algorithm 1: DGGAN framework
Open Source Code | Yes | https://github.com/RingBDStack/DGGAN
Open Datasets | Yes | "We use four different types of directed graphs, including citation network, social network, trust network and hyperlink network to evaluate the performance of the model. The details of the data are described as follows: Cora (Šubelj and Bajec 2013) and CoCit (Tsitsulin et al. 2018) are citation networks of academic papers. Twitter (Choudhury et al. 2010) is a social network. Epinions (Richardson, Agrawal, and Domingos 2003) is a trust network from the online social network Epinions. Google (Palla et al. 2007) is a hyperlink network from pages within Google's sites."
Dataset Splits | Yes | "For DGGAN* and DGGAN, we choose parameters by cross validation and we fix the numbers of generator and discriminator training iterations per epoch n_G = 5, n_D = 15 across all datasets and tasks."
Hardware Specification | No | The paper does not provide specific hardware details such as GPU models, CPU types, or memory specifications used for the experiments.
Software Dependencies | No | The paper mentions implementing methods and setting parameters but does not specify software names with version numbers (e.g., Python 3.x, PyTorch 1.x).
Experiment Setup | Yes | "For DeepWalk, node2vec and APP, the number of walks, the walk length and the window size are set to 10, 80 and 10, respectively, for fair comparison. node2vec is optimized with grid search over its return and in-out parameters (p, q) ∈ {0.25, 0.5, 1, 2, 4} on each dataset and task. For LINE, we utilize both the first-order and the second-order proximities. In addition, the number of negative samples is empirically set to 5. For DGGAN* and DGGAN, we choose parameters by cross validation and we fix the numbers of generator and discriminator training iterations per epoch n_G = 5, n_D = 15 across all datasets and tasks. Throughout our experiments, the dimension of node embeddings is set to 128."
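The reported schedule (n_G = 5 generator and n_D = 15 discriminator iterations per epoch, 128-dimensional embeddings) can be sketched as an alternating training loop. This is an illustrative skeleton only, not the authors' Algorithm 1: the step functions are placeholders standing in for the actual DGGAN loss updates.

```python
import random

random.seed(0)

EMBED_DIM = 128   # embedding dimension used throughout the experiments
N_G, N_D = 5, 15  # generator / discriminator iterations per epoch (from the paper)

def train_discriminator_step():
    # Placeholder for one gradient update on the discriminator loss.
    return random.random()

def train_generator_step():
    # Placeholder for one gradient update on the generator loss.
    return random.random()

def run_epoch():
    # Per epoch: update the discriminator n_D times, then the generator n_G times.
    d_losses = [train_discriminator_step() for _ in range(N_D)]
    g_losses = [train_generator_step() for _ in range(N_G)]
    return sum(d_losses) / len(d_losses), sum(g_losses) / len(g_losses)

d_loss, g_loss = run_epoch()
```

The fixed inner-loop ratio (more discriminator than generator steps) is a common GAN stabilization choice; the paper fixes it across all datasets and tasks.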