RelaySum for Decentralized Deep Learning on Heterogeneous Data

Authors: Thijs Vogels, Lie He, Anastasiia Koloskova, Sai Praneeth Karimireddy, Tao Lin, Sebastian U. Stich, Martin Jaggi

NeurIPS 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | Experimental analysis and practical properties; "In extensive tests on image and text classification, RelaySGD performs better than both kinds of baselines at equal communication budget." |
| Researcher Affiliation | Academia | Thijs Vogels (EPFL); Lie He (EPFL); Anastasiia Koloskova (EPFL); Tao Lin (EPFL); Sai Praneeth Karimireddy (EPFL); Sebastian U. Stich (EPFL); Martin Jaggi (EPFL) |
| Pseudocode | Yes | Algorithm 1: RelaySGD (a hedged sketch of the relay update follows the table). |
| Open Source Code | Yes | "Our code is available at http://github.com/epfml/relaysgd." |
| Open Datasets | Yes | Cifar-10 [17]; ImageNet [5]; AG News [49] |
| Dataset Splits | No | The paper partitions training data across workers to induce heterogeneity ("We partition training data strictly across 16 workers and distribute the classes using a Dirichlet process [47, 20]"; see the partitioning sketch after the table), but does not explicitly provide train/validation/test splits (e.g., percentages or sample counts). |
| Hardware Specification | No | The paper does not provide hardware details such as GPU/CPU models or other processor specifications used for the experiments. |
| Software Dependencies | No | The paper mentions software components such as the Adam optimizer and the VGG-11 architecture, but gives no version numbers for these or for any other dependencies (e.g., Python, PyTorch, CUDA). |
| Experiment Setup | No | The paper states "We use 16 workers on Cifar-10, following the experimental details outlined in Appendix B and hyper-parameter tuning procedure from Appendix C," deferring setup details to appendices; the main text provides no concrete hyperparameter values or training configurations. |
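The Pseudocode row refers to Algorithm 1, RelaySGD, which is built on the RelaySum mechanism: workers sit on a spanning tree, take a local SGD half-step, and forward to each neighbor the sum of that half-step and everything previously received from their *other* neighbors, along with a count of how many workers that sum covers. Below is a minimal single-process simulation of that update as we read it from the paper; the chain topology, the toy quadratic objectives, and all names (`half`, `m`, `c`, ...) are our own assumptions, so treat this as a sketch and consult the reference implementation at http://github.com/epfml/relaysgd for the authoritative version.

```python
# Hedged sketch of the RelaySGD update on a chain of n workers, each
# minimizing a heterogeneous local quadratic f_i(x) = 0.5 * ||x - target_i||^2.
import numpy as np

n, d, lr, steps = 8, 4, 0.1, 200
rng = np.random.default_rng(0)
targets = rng.normal(size=(n, d))   # heterogeneous local optima
x = np.zeros((n, d))                # one model per worker

# Chain graph (a spanning tree): worker i talks to i-1 and i+1.
neighbors = {i: [j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)}

# m[(j, i)]: message last sent from j to i; c[(j, i)]: how many workers'
# updates that message aggregates.
m = {(j, i): np.zeros(d) for i in range(n) for j in neighbors[i]}
c = {(j, i): 0 for i in range(n) for j in neighbors[i]}

for t in range(steps):
    # 1) Local SGD half-step (full gradient of the local quadratic).
    half = x - lr * (x - targets)

    # 2) Relay: to neighbor j, send the own half-step plus the sums
    #    previously received from every neighbor except j.
    new_m, new_c = {}, {}
    for i in range(n):
        for j in neighbors[i]:
            others = [k for k in neighbors[i] if k != j]
            new_m[(i, j)] = half[i] + sum(m[(k, i)] for k in others)
            new_c[(i, j)] = 1 + sum(c[(k, i)] for k in others)
    m, c = new_m, new_c

    # 3) Average received sums; slots for workers whose updates have not
    #    arrived yet (including i itself) are filled with the own half-step.
    for i in range(n):
        received = sum(m[(j, i)] for j in neighbors[i])
        count = sum(c[(j, i)] for j in neighbors[i])
        x[i] = (received + (n - count) * half[i]) / n

print("spread across workers:", np.linalg.norm(x - x.mean(axis=0)))
print("distance to mean optimum:", np.linalg.norm(x.mean(axis=0) - targets.mean(axis=0)))
```

Because a message travels one hop per step, updates arrive with tree-distance delays; the count term simply substitutes the local half-step for the not-yet-arrived contributions, which is what keeps the scheme an exact (if delayed) average rather than a gossip approximation.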
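The Dataset Splits row quotes the paper's heterogeneity recipe: class labels are distributed over 16 workers via a Dirichlet process [47, 20]. A common way to realize such a split is sketched below under our own assumptions; the function name, the concentration value `alpha`, and the synthetic labels are illustrative, not the paper's exact procedure (which is given in its appendices).

```python
# Hedged sketch of class-wise Dirichlet partitioning for non-IID splits.
import numpy as np

def dirichlet_partition(labels, n_workers=16, alpha=1.0, seed=0):
    """Assign sample indices to workers, drawing per-class worker
    proportions from Dirichlet(alpha); smaller alpha => more skew."""
    rng = np.random.default_rng(seed)
    workers = [[] for _ in range(n_workers)]
    for cls in np.unique(labels):
        idx = np.flatnonzero(labels == cls)
        rng.shuffle(idx)
        # Cut this class's samples according to the drawn proportions.
        proportions = rng.dirichlet(alpha * np.ones(n_workers))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for worker, shard in zip(workers, np.split(idx, cuts)):
            worker.extend(shard.tolist())
    return workers

# Example on synthetic Cifar-10-like labels (10 classes, 50k samples).
labels = np.random.default_rng(1).integers(0, 10, size=50_000)
shards = dirichlet_partition(labels, n_workers=16, alpha=0.1)
print([len(s) for s in shards])
```

With small `alpha` (e.g., 0.1) most workers see only a few classes, which is the regime where the paper's heterogeneity-robustness claims matter; `alpha -> inf` recovers a near-IID split.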