Reconciling Competing Sampling Strategies of Network Embedding

Authors: Yuchen Yan, Baoyu Jing, Lihui Liu, Ruijie Wang, Jinning Li, Tarek Abdelzaher, Hanghang Tong

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In this section, we evaluate the effectiveness of the proposed algorithm (SENSEI) for solving link prediction and node recommendation simultaneously in plain networks. ... The results of link prediction on all 4 plain networks are presented in Table 1."
Researcher Affiliation | Academia | "Yuchen Yan, Baoyu Jing, Lihui Liu, Ruijie Wang, Jinning Li, Tarek Abdelzaher, Hanghang Tong; University of Illinois at Urbana-Champaign, IL, USA; {yucheny5, baoyuj2, lihuil2, ruijiew2, jinning4, zaher, htong}@illinois.edu"
Pseudocode | Yes | "In this section, we give the detailed algorithm of SENSEI in Algorithm 1."
Open Source Code | Yes | "The simplified code of SENSEI is on: https://github.com/yucheny5/SENSEI."
Open Datasets | Yes | "Datasets. We use 4 public real-world datasets to evaluate the proposed SENSEI model: C.ele [51], Cora [39], Citeseer [39], NS [31]."
Dataset Splits | Yes | "For link prediction and node recommendation in plain networks, we randomly split edges in every dataset into 70/10/20% for training, validation, and test." (A split sketch follows the table.)
Hardware Specification | Yes | "All experiments are run on a Tesla-V100 GPU."
Software Dependencies | No | The paper mentions the hardware used but does not provide specific software dependencies with version numbers.
Experiment Setup | Yes | "For SENSEI on 4 datasets: {C.ele, Cora, Citeseer, NS}, we set the threshold τ as {0.008, 0.01, 0.005, 0.05}, the number of epochs in Step 1 as {40, 100, 20, 20}, the number of epochs in Step 2 as {40, 100, 50, 20}, the learning rate in Step 1 as {0.02, 0.1, 0.1, 0.2}, the learning rate in Step 2 as {0.01, 0.01, 0.005, 0.1}, the positive margin γ as {0.05, 0.0001, 0.0001, 0.1} and the negative sample number k as 40 on all 4 datasets." (A configuration sketch follows the table.)