Improved Dynamic Graph Learning through Fault-Tolerant Sparsification

Authors: Chunjiang Zhu, Sabine Storandt, Kam-Yiu Lam, Song Han, Jinbo Bi

ICML 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments have confirmed the computational efficiency and accuracy of the proposed methods for learning on dynamic graphs. We empirically show that the computational cost of maintaining spectral sparsifiers can be significantly improved for dynamic graphs, while the accuracy of subsequent Laplacian-regularized estimation and graph SSL is not significantly affected.
Researcher Affiliation | Academia | 1) University of Connecticut, 2) University of Konstanz, 3) City University of Hong Kong.
Pseudocode | Yes | Algorithm 1 (Light-FTSS); Algorithm 2 (FTSS).
Open Source Code | No | The paper does not provide an explicit statement about releasing the source code for the described methodology, nor a link to a code repository.
Open Datasets | Yes | "We used the Facebook social network data from SNAP (https://snap.stanford.edu/data/ego-Facebook.html). The numbers of vertices and edges in the ego-combined graph are 4309 and 88234, respectively. [...]"
Dataset Splits | No | The paper mentions the number of revealed labeled vertices for graph SSL (l = 1000, with 500 ones and 500 zeros) and varies parameters across experiments, but it does not explicitly provide training/validation/test dataset splits (e.g., percentages or sample counts for each split) needed for reproduction.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory, or cloud instance types) used for running its experiments.
Software Dependencies | No | The paper describes algorithms and experiments but does not provide specific ancillary software details with version numbers (e.g., programming languages, libraries, or solvers) needed to replicate the experiments.
Experiment Setup | Yes | For Laplacian-regularized estimation, the parameter λ ∈ {10^-3, 10^-2, 10^-1, 10^0}, while for graph SSL, λ ∈ {10^-6, 10^-4, 10^-2, 10^0}; only the smallest error achieved across λ values is reported. In graph SSL, the labels were the sign of the signal β*, and the number of revealed labeled vertices was l = 1000 (500 ones and 500 zeros), following (Calandriello et al., 2018). For our algorithm, the parameters were set to f ∈ {1, 3, 5, 7}, ϵ = 0.2, and ρ = 20.