Fine-Grained Analysis of Stability and Generalization for Stochastic Gradient Descent
Authors: Yunwen Lei, Yiming Ying
ICML 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we provide a fine-grained analysis of stability and generalization for SGD by substantially relaxing these assumptions. Firstly, we establish stability and generalization for SGD by removing the existing bounded gradient assumptions. |
| Researcher Affiliation | Academia | Yunwen Lei (Department of Computer Science, University of Kaiserslautern, Germany; School of Computer Science, University of Birmingham, United Kingdom) and Yiming Ying (Department of Mathematics and Statistics, State University of New York at Albany, USA). |
| Pseudocode | No | The paper describes the SGD update rule mathematically (Definition 2) but does not provide a structured pseudocode or algorithm block; a minimal sketch of the update appears after this table. |
| Open Source Code | No | The paper does not provide any statement about releasing open-source code or a link to a code repository. |
| Open Datasets | No | The paper discusses 'training examples' theoretically but does not reference or provide access information for any specific, publicly available dataset. |
| Dataset Splits | No | The paper is theoretical and does not perform experiments with specific datasets, so no dataset split information is provided. |
| Hardware Specification | No | The paper is theoretical and reports no experiments, so no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and reports no experiments, so no software dependencies with version numbers are listed. |
| Experiment Setup | No | The paper is theoretical and describes no experiments, so no experimental setup details are provided. |
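
The paper specifies SGD only through its update rule (Definition 2): at step t, draw an index i_t uniformly from {1, ..., n} and set w_{t+1} = w_t - eta_t * grad f(w_t; z_{i_t}), where eta_t is the step size and f(.; z) the loss on example z. The sketch below is a minimal Python rendering of that rule, not code from the paper; the function names, the constant step-size schedule, and the least-squares loss in the usage example are illustrative assumptions.

```python
import numpy as np

def sgd(grad, w0, data, step_sizes, rng=None):
    """Plain SGD per Definition 2: w_{t+1} = w_t - eta_t * grad(w_t; z_{i_t}),
    with i_t drawn uniformly from the n training examples."""
    rng = rng if rng is not None else np.random.default_rng(0)
    w = np.asarray(w0, dtype=float)
    n = len(data)
    for eta in step_sizes:
        i = rng.integers(n)              # i_t ~ Uniform{0, ..., n-1}
        w = w - eta * grad(w, data[i])   # one stochastic gradient step
    return w

# Illustrative usage (hypothetical setup): least squares
# f(w; (x, y)) = 0.5 * (w @ x - y)^2, so grad_w f = (w @ x - y) * x.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = X @ np.ones(5)
data = list(zip(X, y))
grad = lambda w, z: (w @ z[0] - z[1]) * z[0]
w_hat = sgd(grad, np.zeros(5), data, step_sizes=[0.01] * 2000)
```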