Effective Decoding in Graph Auto-Encoder Using Triadic Closure
Authors: Han Shi, Haozheng Fan, James T. Kwok (pp. 906-913)
AAAI 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on link prediction, node clustering and graph generation show that the use of triads leads to more accurate prediction, clustering and better preservation of the graph characteristics. |
| Researcher Affiliation | Collaboration | ¹Department of Computer Science and Engineering, Hong Kong University of Science and Technology, Hong Kong; ²Amazon |
| Pseudocode | Yes | Algorithm 1 Training the triad variational graph autoencoder (TVGA) and triad graph auto-encoder (TGA) using SGD. |
| Open Source Code | No | The paper does not provide any explicit statement or link regarding the availability of its source code. |
| Open Datasets | Yes | Experiments are performed on three standard benchmark citation graph data sets (Sen et al. 2008): Cora, Citeseer, and Pubmed (Table 1). http://www.cs.umd.edu/~sen/lbc-proj/LBC.html |
| Dataset Splits | Yes | In this experiment, 85% of the edges and non-edges (unconnected nodes) from each graph are randomly selected to form the training set, another 10% is used as the validation set, and the remaining 5% as testing set. |
| Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper mentions 'Adam (Kingma and Ba 2014) is the optimizer' but does not specify version numbers for any software dependencies or libraries. |
| Experiment Setup | Yes | The proposed algorithm uses a mini-batch size of 5,000. Adam (Kingma and Ba 2014) is the optimizer, with a learning rate of 0.0005. Both the hidden layer and embedding layer of the encoder have 32 hidden units. The convolution layer in the triad decoder has 4 filters (i.e., the dimension of z_triplet is 1 × 32 × 4). |
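The split and setup rows above can be sketched in code. This is a minimal illustration of the reported 85%/10%/5% edge split and the stated hyperparameters; the function name, seeding scheme, and dictionary keys are our own assumptions, not from the paper.

```python
import random

# Hyperparameters as reported in the paper (key names are ours)
HPARAMS = {
    "batch_size": 5000,
    "learning_rate": 0.0005,
    "hidden_units": 32,
    "embedding_units": 32,
    "decoder_filters": 4,
}

def split_edges(edges, train_frac=0.85, val_frac=0.10, seed=0):
    """Randomly split edges (or non-edges) into train/validation/test
    sets using the 85/10/5 proportions reported in the paper."""
    rng = random.Random(seed)
    edges = list(edges)
    rng.shuffle(edges)
    n = len(edges)
    n_train = int(train_frac * n)
    n_val = int(val_frac * n)
    train = edges[:n_train]
    val = edges[n_train:n_train + n_val]
    test = edges[n_train + n_val:]
    return train, val, test

# Toy example with 1,000 synthetic edges
edges = [(i, i + 1) for i in range(1000)]
train, val, test = split_edges(edges)
print(len(train), len(val), len(test))  # 850 100 50
```

The exact sampling procedure (e.g., whether edges and non-edges are split jointly or separately) is not specified in the report, so this sketch splits a single edge list.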