On the Convergence of mSGD and AdaGrad for Stochastic Optimization
Authors: Ruinan Jin, Yu Xing, Xingkang He
ICLR 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | This is a theoretical paper focused on the convergence analysis of the mSGD and AdaGrad optimization algorithms. The main results are stated in Theorems 1-3 of the main paper; their proofs, together with supporting lemmas and the proofs of those lemmas, are given in the Appendix. |
| Researcher Affiliation | Academia | Ruinan Jin: LSC, NCMIS, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190, China; School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 100049, China. Yu Xing: Division of Decision and Control Systems, KTH Royal Institute of Technology, SE-100 44 Stockholm, Sweden. Xingkang He: Department of Electrical Engineering, University of Notre Dame, IN, USA. |
| Pseudocode | No | The paper describes the algorithms using mathematical equations but does not provide structured pseudocode or algorithm blocks (a generic sketch of the standard update rules appears after this table). |
| Open Source Code | No | The paper makes no statement and provides no link indicating that code for the described methodology has been open-sourced. |
| Open Datasets | No | This is a theoretical paper and does not involve experimental evaluation on datasets. |
| Dataset Splits | No | This is a theoretical paper and does not involve experimental evaluation on datasets, thus no training/validation/test splits are mentioned. |
| Hardware Specification | No | This is a theoretical paper and does not describe any experimental setup or hardware used for computations. |
| Software Dependencies | No | This is a theoretical paper and does not describe any specific software dependencies or versions. |
| Experiment Setup | No | This is a theoretical paper and does not describe any experimental setup or hyperparameters. |
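For reference, the sketch below shows the standard textbook forms of the mSGD (heavy-ball momentum SGD) and AdaGrad update rules on a simple noisy quadratic. This is an illustrative assumption based on the common formulations of these algorithms, not the paper's exact method; the step-size schedules and constants analyzed in the paper may differ.

```python
# Minimal sketch of the standard mSGD and AdaGrad updates on a noisy
# quadratic f(x) = 0.5 * ||x||^2. Textbook forms only; the paper's exact
# formulation (step-size schedule, momentum parameter) may differ.
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(x):
    # Stochastic gradient of f(x) = 0.5 * ||x||^2 with additive noise.
    return x + 0.1 * rng.standard_normal(x.shape)

def msgd(x0, lr=0.1, beta=0.9, steps=1000):
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        g = noisy_grad(x)
        v = beta * v + g   # momentum (heavy-ball) buffer
        x = x - lr * v     # parameter update
    return x

def adagrad(x0, lr=0.5, eps=1e-8, steps=1000):
    x, G = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        g = noisy_grad(x)
        G = G + g ** 2                       # accumulated squared gradients
        x = x - lr * g / (np.sqrt(G) + eps)  # coordinate-wise adaptive step
    return x

x0 = np.array([5.0, -3.0])
print("mSGD:   ", msgd(x0))
print("AdaGrad:", adagrad(x0))
```

Both routines drive the iterate toward the minimizer at the origin; AdaGrad's per-coordinate step size shrinks as squared gradients accumulate, which is the mechanism the paper's convergence analysis concerns.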