Temporal Network Embedding with High-Order Nonlinear Information

Authors: Zhenyu Qiu, Wenbin Hu, Jia Wu, Weiwei Liu, Bo Du, Xiaohua Jia (pp. 5436-5443)

AAAI 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experimental results on four real-world networks demonstrate the effectiveness of the proposed method.
Researcher Affiliation | Academia | (1) School of Computer Science, Wuhan University; (2) Department of Computer Science, City University of Hong Kong; (3) Department of Computing, Macquarie University; (4) Shenzhen Research Institute, Wuhan University, China
Pseudocode | Yes | Algorithm 1: Temporal Random Walk
Open Source Code | No | The paper does not provide any statement about code availability, nor does it include links to source code repositories or supplementary materials containing code.
Open Datasets | Yes | "We employ four real-world networks to validate the effectiveness of HNIP on four application scenarios." Temporal networks: Leskovec-Ng (Chen and III 2017), DBLP (Zuo et al. 2018), Facebook (Viswanath et al. 2009), Twitter (Conover et al. 2011).
Dataset Splits | Yes | For each dataset, the dynamic network is divided into two parts by an assigned time point St: the first part is the training set and the latter is the test set. Other hyper-parameters are tuned by a grid search on the validation set.
Hardware Specification | No | The paper does not specify any details about the hardware used for the experiments (e.g., CPU or GPU models, memory, or cloud instances).
Software Dependencies | No | The paper mentions methods such as the Skip-gram model, the KNN algorithm, and a deep belief network, but does not provide version numbers for any software dependencies or libraries.
Experiment Setup | Yes | For HNIP, the walk length T is set to 40, the walk times wt to 10, and the restart ratio r to 0.2; other hyper-parameters are tuned by a grid search on the validation set. For DeepWalk, CTDNE, and NetWalk, the window size is set to 10, the walk length to 40, and the walk times to 10. For SDNE, the neural network structure follows Table 2. The embedding size is 128 for all methods.
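The "Dataset Splits" row describes splitting each dynamic network at an assigned time point St. A minimal sketch of such a temporal split, assuming edges are stored as `(u, v, timestamp)` triples (the function name and representation are illustrative, not taken from the paper):

```python
def temporal_split(edges, split_time):
    """Split (u, v, timestamp) edges at an assigned time point St:
    edges strictly before St form the training set, the rest the test set."""
    train = [e for e in edges if e[2] < split_time]
    test = [e for e in edges if e[2] >= split_time]
    return train, test

# Example: three timestamped edges split at St = 3.
edges = [(0, 1, 1), (1, 2, 3), (2, 3, 5)]
train, test = temporal_split(edges, split_time=3)
# → train = [(0, 1, 1)], test = [(1, 2, 3), (2, 3, 5)]
```

Splitting by time, rather than by random sampling, ensures the test period never leaks into training, which matters for temporal embedding evaluation.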
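The "Experiment Setup" row fixes a walk length of 40, 10 walks per node, and a restart ratio of 0.2 for the temporal random walk (Algorithm 1). A hedged sketch of a time-respecting random walk with restart under those hyper-parameters; the adjacency representation and function are assumptions for illustration, not the paper's exact algorithm:

```python
import random

def temporal_random_walk(adj, start, walk_length=40, restart_ratio=0.2, seed=None):
    """One time-respecting random walk with restart.

    adj maps node -> list of (neighbor, timestamp) edges. Each step only
    follows edges whose timestamp is >= the last traversed edge's timestamp,
    so the walk respects temporal order; with probability restart_ratio the
    walk jumps back to the source node and resets its time constraint.
    """
    rng = random.Random(seed)
    walk = [start]
    last_time = float("-inf")
    while len(walk) < walk_length:
        if rng.random() < restart_ratio:  # restart at the source node
            walk.append(start)
            last_time = float("-inf")
            continue
        # Only edges that do not go backwards in time are candidates.
        candidates = [(v, t) for v, t in adj.get(walk[-1], []) if t >= last_time]
        if not candidates:  # dead end: no time-respecting continuation
            break
        nxt, last_time = rng.choice(candidates)
        walk.append(nxt)
    return walk

# Example: a three-node chain with increasing timestamps, no restarts.
adj = {0: [(1, 1)], 1: [(2, 2)], 2: []}
walk = temporal_random_walk(adj, 0, walk_length=5, restart_ratio=0.0, seed=1)
# → [0, 1, 2]  (the walk dead-ends at node 2)
```

Per the setup above, one would call this with `walk_length=40` and `restart_ratio=0.2`, repeating `wt=10` walks from every node to build the corpus fed to the Skip-gram model.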