Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays

Authors: Konstantin Mishchenko, Francis Bach, Mathieu Even, Blake E. Woodworth

NeurIPS 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Figure 1: "We ran an experiment on a simple least-squares problem with random data and tuned all stepsizes." (A hedged sketch of such a setup follows this table.)
Researcher Affiliation | Academia | DI ENS, École normale supérieure, Université PSL, CNRS, INRIA, 75005 Paris, France
Pseudocode | Yes | Algorithm 1: Asynchronous SGD (simulated in the second sketch below)
Open Source Code | No | The paper does not provide an unambiguous statement or a direct link indicating that the source code for the methodology described in this paper is publicly available.
Open Datasets | No | The paper mentions "a simple least-squares problem with random data" for an experiment but does not provide concrete access information (a specific link, DOI, repository name, formal citation, or reference to an established benchmark dataset) for a publicly available or open dataset.
Dataset Splits | No | The paper does not provide dataset split information (exact percentages, sample counts, citations to predefined splits, or a detailed splitting methodology) needed to reproduce the data partitioning.
Hardware Specification | No | The paper does not provide specific hardware details (exact GPU/CPU models, processor types with speeds, memory amounts, or detailed machine specifications) used for running its experiments.
Software Dependencies | No | The paper does not provide specific ancillary software details (e.g., library or solver names with version numbers, such as Python 3.8 or CPLEX 12.4) needed to replicate the experiment.
Experiment Setup | No | The paper states "Additional details about the experiment can be found in Appendix A" but does not provide specific experimental setup details (concrete hyperparameter values, training configurations, or system-level settings) within the main body of the provided text.
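
The "Research Type" row above quotes the paper's only in-text description of its experiment: a simple least-squares problem with random data, with all stepsizes tuned. As a point of reference, here is a minimal sketch of such a setup; the dimensions, data distribution, and noise level are assumptions for illustration, not the paper's actual values (those would be in its Appendix A, which is not reproduced here).

```python
import numpy as np

# Hypothetical least-squares instance; the paper only says "random data",
# so the dimensions, distribution, and noise level below are assumptions.
rng = np.random.default_rng(0)
n, d = 200, 20
A = rng.normal(size=(n, d))                 # random design matrix
x_star = rng.normal(size=d)                 # ground-truth solution
b = A @ x_star + 0.1 * rng.normal(size=n)   # noisy targets

def loss(x):
    """Full objective f(x) = (1/2n) * ||Ax - b||^2."""
    r = A @ x - b
    return 0.5 * np.mean(r ** 2)

def grad(x, i):
    """Stochastic gradient of f at x from the single sample i."""
    return (A[i] @ x - b[i]) * A[i]
```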
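The "Pseudocode" row refers to the paper's Algorithm 1, Asynchronous SGD: the server applies each worker's gradient as soon as it arrives, so the gradient applied at any step was computed at an older, delayed iterate, with no assumption on how large the delay is. Continuing the sketch above, here is a hedged single-process simulation of that update rule; the exponential compute times, worker count, and stepsize are assumptions, and the stepsize tuning the paper performs is omitted.

```python
import heapq

def async_sgd(x0, stepsize, num_workers, num_steps, seed=1):
    """Simulate Asynchronous SGD (in the spirit of the paper's Algorithm 1):
    each worker computes a gradient at the iterate it was last handed; the
    server applies gradients in arrival order, so each applied gradient is
    delayed by however many other updates landed in the meantime."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    # Event queue of (arrival_time, worker_id, iterate the gradient uses);
    # exponential compute times are an assumption, not from the paper.
    events = [(rng.exponential(), w, x0.copy()) for w in range(num_workers)]
    heapq.heapify(events)
    for _ in range(num_steps):
        t, w, x_old = heapq.heappop(events)
        i = rng.integers(n)                   # worker's random sample
        x = x - stepsize * grad(x_old, i)     # apply the delayed gradient
        # Hand the worker the current iterate and schedule its next arrival.
        heapq.heappush(events, (t + rng.exponential(), w, x.copy()))
    return x

x_hat = async_sgd(np.zeros(d), stepsize=0.005, num_workers=8, num_steps=5000)
print(f"final loss: {loss(x_hat):.4f}")
```

In this simulation the delay of each applied gradient is not fixed in advance: it is simply the number of other updates that arrived while the worker was computing, which matches the paper's "arbitrary delays" setting more closely than a constant-delay model would.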