Communication trade-offs for Local-SGD with large step size
Authors: Aymeric Dieuleveut, Kumar Kshitij Patel
NeurIPS 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "We support our analysis by experiments illustrating the behavior of the algorithms. Results in the online setting and experiments are presented in Appendix A.2 and Appendix B." |
| Researcher Affiliation | Academia | Kumar Kshitij Patel: MLO, EPFL, Lausanne, Switzerland; TTIC (Toyota Technological Institute at Chicago), kkpatel@ttic.edu. Aymeric Dieuleveut: MLO, EPFL, Lausanne, Switzerland; CMAP, École Polytechnique, Palaiseau, France, aymeric.dieuleveut@polytechnique.edu. |
| Pseudocode | Yes | "Pseudo-code of the algorithm is given in the Appendix, in Fig. S5." (A hedged sketch of the scheme appears below this table.) |
| Open Source Code | No | The paper does not include an unambiguous statement about releasing source code for the described methodology, nor does it provide a direct link to a repository. |
| Open Datasets | No | The paper applies its analysis to least squares regression and logistic regression but, within the provided text, does not name or give access information for any publicly available dataset used in its experiments. |
| Dataset Splits | No | The paper does not provide specific details about training, validation, or test dataset splits. It mentions experiments in the Appendix but does not detail the splits in the main text. |
| Hardware Specification | No | The paper does not explicitly describe the hardware used to run its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., library names with versions). |
| Experiment Setup | No | The paper does not provide specific experimental setup details such as hyperparameter values (e.g., learning rate, batch size, epochs) or optimizer settings in the main text. |
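Since the paper defers its pseudo-code to the appendix (Fig. S5), here is a minimal sketch of the generic Local-SGD scheme the paper studies: several workers each take a few local SGD steps from a shared iterate, and the iterates are periodically averaged (one communication round). The function name `local_sgd`, the toy least-squares objective, and all hyperparameter values below are illustrative assumptions, not the paper's actual algorithm or configuration.

```python
import numpy as np

def local_sgd(grad_fn, x0, workers=4, rounds=10, local_steps=8, lr=0.1, seed=0):
    """Sketch of Local-SGD: each worker runs `local_steps` stochastic gradient
    steps from the shared iterate, then the iterates are averaged (one
    communication round). `grad_fn(x, rng)` returns a stochastic gradient."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(rounds):
        local_iterates = []
        for _ in range(workers):
            w = x.copy()
            for _ in range(local_steps):
                w = w - lr * grad_fn(w, rng)   # local SGD step, no communication
            local_iterates.append(w)
        x = np.mean(local_iterates, axis=0)    # communication: average iterates
    return x

# Toy noisy gradient for f(x) = 0.5 * ||x - 1||^2, echoing the least-squares
# setting the paper analyzes (objective chosen purely for illustration).
noisy_grad = lambda x, rng: (x - 1.0) + 0.1 * rng.standard_normal(x.shape)
print(local_sgd(noisy_grad, x0=np.zeros(3), rounds=50))  # converges near [1. 1. 1.]
```

Raising `local_steps` relative to `rounds` trades communication for extra local computation, which is the communication trade-off the paper's title refers to.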