Stop Wasting My Gradients: Practical SVRG
Authors: Reza Babanezhad Harikandeh, Mohamed Osama Ahmed, Alim Virani, Mark Schmidt, Jakub Konečný, Scott Sallinen
NeurIPS 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we present experimental results that evaluate our proposed variations on the SVRG method. |
| Researcher Affiliation | Academia | Department of Computer Science University of British Columbia, School of Mathematics University of Edinburgh, Department of Electrical and Computer Engineering University of British Columbia |
| Pseudocode | Yes | Algorithm 1 Batching SVRG, Algorithm 2 Mixed SVRG and SG Method, Algorithm 3 Heuristic for skipping evaluations of fi at x |
| Open Source Code | No | The paper does not contain an explicit statement about releasing its source code or a link to a code repository. |
| Open Datasets | Yes | We consider the datasets used by [1], whose properties are listed in the supplementary material. |
| Dataset Splits | No | The paper refers to the datasets used by [1], with properties listed in the supplementary material, but does not explicitly state training, validation, and test splits in the main text. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | Yes | As in their work we add a bias variable, normalize dense features, and set the regularization parameter λ to 1/n. We used a step-size of α = 1/L and set m = \|B_s\|, which gave good performance across methods and datasets. |
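The pseudocode row above refers to the paper's "Batching SVRG" (Algorithm 1), which replaces SVRG's full-gradient snapshot with a gradient over a growing batch B_s and runs m = |B_s| inner steps. The following is a minimal sketch of that idea on an ℓ2-regularized least-squares objective; the initial batch size and the geometric growth rate are illustrative assumptions, not values from the paper.

```python
import numpy as np

def batching_svrg(A, b, lam, alpha, n_outer=20, seed=0):
    """Sketch of SVRG with a growing batch for the snapshot gradient.

    Objective: f(x) = (1/n) * sum_i 0.5*(a_i^T x - b_i)^2 + (lam/2)*||x||^2.
    Per the paper's setup: step-size alpha = 1/L, inner loop length m = |B_s|.
    Batch-size schedule (start and doubling) is an assumption for illustration.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)

    def grad_i(x, i):
        # Gradient of one component f_i, regularizer included.
        return A[i] * (A[i] @ x - b[i]) + lam * x

    batch = max(1, n // 16)  # initial batch size (assumed)
    for _ in range(n_outer):
        x_tilde = x.copy()
        Bs = rng.choice(n, size=min(batch, n), replace=False)
        # Snapshot gradient over the batch instead of all n examples.
        mu = np.mean([grad_i(x_tilde, i) for i in Bs], axis=0)
        for _ in range(len(Bs)):  # m = |B_s|
            i = rng.integers(n)
            # Standard SVRG variance-reduced update.
            x -= alpha * (grad_i(x, i) - grad_i(x_tilde, i) + mu)
        batch = min(n, 2 * batch)  # grow the batch each outer pass (assumed rate)
    return x
```

A quick usage pattern on synthetic data would set `alpha = 1/L` with `L` bounded by the largest squared row norm of `A` plus `lam`, matching the α = 1/L choice reported in the experiment setup.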