SDCA without Duality, Regularization, and Individual Convexity
Author: Shai Shalev-Shwartz
ICML 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We prove linear convergence rates even if individual loss functions are non-convex, as long as the expected loss is strongly convex. |
| Researcher Affiliation | Academia | Shai Shalev-Shwartz SHAIS@CS.HUJI.AC.IL School of Computer Science and Engineering, The Hebrew University of Jerusalem, Israel |
| Pseudocode | Yes | Algorithm 1: Dual-Free SDCA for Regularized Objectives |
| Open Source Code | No | The paper does not provide any statement or link regarding the availability of open-source code for the described methodology. |
| Open Datasets | No | The paper focuses on theoretical analysis of algorithms and does not describe experiments on specific datasets, so no dataset access information is provided. |
| Dataset Splits | No | The paper focuses on theoretical analysis and does not describe experiments requiring dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not describe any experiments that would require specific hardware for execution. |
| Software Dependencies | No | The paper is theoretical and does not discuss software implementations or their specific version dependencies. |
| Experiment Setup | No | The paper is theoretical and describes algorithms and their analysis, but it does not detail an experimental setup or hyperparameters for empirical evaluation. |
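Although the paper provides only pseudocode (Algorithm 1: Dual-Free SDCA) and no reference implementation, the dual-free update rule is simple enough to sketch. The following is a minimal, unofficial NumPy sketch of the algorithm's core loop, instantiated with a squared loss for illustration; the step size `eta`, the synthetic problem, and all variable names are our own choices, not the paper's.

```python
import numpy as np

def dual_free_sdca(A, b, lam, eta, epochs, seed=0):
    """Sketch of dual-free SDCA for the regularized objective
        P(w) = (1/n) sum_i phi_i(w) + (lam/2) ||w||^2,
    here with squared loss phi_i(w) = 0.5 * (a_i . w - b_i)^2.
    Maintains pseudo-dual vectors alpha_i and the primal invariant
        w = (1 / (lam * n)) * sum_i alpha_i,
    with no explicit dual objective (hence "dual-free")."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    alpha = np.zeros((n, d))            # pseudo-dual variables, one per example
    w = alpha.sum(axis=0) / (lam * n)   # primal iterate (zero initially)
    for _ in range(epochs * n):
        i = rng.integers(n)             # sample an example uniformly at random
        grad_i = (A[i] @ w - b[i]) * A[i]   # gradient of phi_i at w
        v = grad_i + alpha[i]           # vanishes at the optimum (variance reduction)
        alpha[i] -= eta * lam * n * v   # dual-style update
        w -= eta * v                    # primal update preserving the invariant
    return w
```

For a strongly convex instance like this one, the iterates converge linearly to the ridge-regression solution; the paper's main result is that the same kind of guarantee holds even when the individual `phi_i` are non-convex, provided the average loss is strongly convex.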