Stability and Generalization of Stochastic Compositional Gradient Descent Algorithms
Authors: Ming Yang, Xiyuan Wei, Tianbao Yang, Yiming Ying
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we provide the stability and generalization analysis of stochastic compositional gradient descent algorithms in the framework of statistical learning theory. |
| Researcher Affiliation | Academia | (1) Department of Mathematics and Statistics, State University of New York at Albany, Albany, NY 12222, USA; (2) Department of Computer Science and Engineering, Texas A&M University, College Station, TX 77843, USA; (3) School of Mathematics and Statistics, The University of Sydney, Sydney, NSW 2006, Australia. |
| Pseudocode | Yes | Algorithm 1: (Stochastically Corrected) Stochastic Compositional Gradient Descent; an illustrative sketch of the plain SCGD update follows the table. |
| Open Source Code | No | The paper neither states that its source code is publicly available nor links to a code repository. |
| Open Datasets | No | The paper is theoretical and focuses on analysis rather than experimental evaluation, so it does not use any specific publicly available datasets for training. |
| Dataset Splits | No | The paper is theoretical and does not describe experimental setup details such as training, validation, or test data splits. |
| Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not describe specific software dependencies or version numbers for experimental reproducibility. |
| Experiment Setup | No | The paper is theoretical and does not provide specific experimental setup details such as hyperparameters or training configurations. |
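Since Algorithm 1 is the paper's only pseudocode, a minimal NumPy sketch of the plain SCGD update may help make the compositional structure concrete. It assumes an objective of the form F(w) = E[f(g(w))]; the helper names (`g_sample`, `jac_g_sample`, `grad_f`) and the constant step size `eta` and moving-average weight `beta` are illustrative assumptions, not the paper's notation, and this is not a reproduction of the paper's exact Algorithm 1 (which adds a stochastic correction to the inner-value tracker).

```python
import numpy as np

def scgd(grad_f, g_sample, jac_g_sample, w0, y0, steps, eta, beta, seed=0):
    """Minimal SCGD sketch for F(w) = E[f(g(w))] (illustrative only).

    g_sample(w, rng)     -- unbiased stochastic estimate of the inner map g(w)
    jac_g_sample(w, rng) -- unbiased stochastic estimate of the Jacobian of g at w
    grad_f(y)            -- gradient of the outer function f at y
    """
    rng = np.random.default_rng(seed)
    w = np.array(w0, dtype=float)
    y = np.array(y0, dtype=float)
    for _ in range(steps):
        # Track the inner function value with a moving average -- the auxiliary
        # sequence that distinguishes SCGD from plain SGD on f(g(w)).
        y = (1.0 - beta) * y + beta * g_sample(w, rng)
        # Chain-rule step using the tracked inner value in place of g(w).
        w = w - eta * jac_g_sample(w, rng).T @ grad_f(y)
    return w

# Toy problem (hypothetical): g(w) = A @ w + noise and f(y) = 0.5 * ||y||^2,
# so F(w) = 0.5 * ||A @ w||^2 in expectation, minimized at w = 0.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
g = lambda w, rng: A @ w + 0.01 * rng.standard_normal(2)
jac_g = lambda w, rng: A  # additive noise, so the Jacobian is just A
grad_f = lambda y: y
w_out = scgd(grad_f, g, jac_g, w0=np.ones(2), y0=np.zeros(2),
             steps=2000, eta=0.05, beta=0.5)
```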