Revisiting the Last-Iterate Convergence of Stochastic Gradient Methods
Authors: Zijian Liu, Zhengyuan Zhou
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Ethics Statement: This is a theory work. Hence, there are no potential ethics concerns. Reproducibility Statement: We include the full proofs of all theorems in the appendix. |
| Researcher Affiliation | Academia | Zijian Liu, Zhengyuan Zhou; Stern School of Business, New York University ({zl3067,zzhou}@stern.nyu.edu) |
| Pseudocode | Yes | Algorithm 1: Composite Stochastic Mirror Descent (CSMD). Input: $x_1 \in \mathcal{X}$, $\eta_t > 0$, $t \in [T]$. For $t = 1$ to $T$: $x_{t+1} = \operatorname{argmin}_{x \in \mathcal{X}} \, h(x) + \langle \hat{g}_t, x - x_t \rangle + D_\psi(x, x_t)/\eta_t$. Return $x_{T+1}$. |
| Open Source Code | No | No explicit statement or link providing access to the source code for the methodology described in this paper is found. |
| Open Datasets | No | The paper is theoretical and does not involve empirical evaluation on datasets. |
| Dataset Splits | No | The paper is theoretical and does not involve dataset splits for validation. |
| Hardware Specification | No | The paper is theoretical and does not describe experimental hardware specifications. |
| Software Dependencies | No | The paper is theoretical and does not mention software dependencies with specific version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with specific hyperparameters or training configurations. |
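Although the paper releases no code, the CSMD update in the pseudocode row can be sketched directly. The snippet below is a minimal illustration, not the authors' implementation: it specializes the Bregman divergence $D_\psi$ to the Euclidean case $\|x - x_t\|^2/2$, where the mirror step reduces to a proximal gradient update. The names `grad_oracle` and `h_prox` are hypothetical placeholders for the stochastic gradient $\hat{g}_t$ and the proximal operator of the composite term $h$.

```python
import numpy as np

def csmd(grad_oracle, x1, etas, T, h_prox=None):
    """Sketch of Composite Stochastic Mirror Descent (Algorithm 1),
    assuming the Euclidean divergence D_psi(x, y) = ||x - y||^2 / 2.

    grad_oracle(x): returns a stochastic gradient \\hat{g}_t at x.
    h_prox(v, eta): proximal operator of h; identity when h = 0.
    """
    x = np.asarray(x1, dtype=float)
    for t in range(T):
        g = grad_oracle(x)
        # x_{t+1} = argmin_x h(x) + <g, x - x_t> + ||x - x_t||^2 / (2 eta_t)
        v = x - etas[t] * g
        x = h_prox(v, etas[t]) if h_prox is not None else v
    return x  # last iterate x_{T+1}, the object of the convergence analysis
```

For example, with the exact gradient of $f(x) = \|x\|^2/2$ (i.e. `grad_oracle = lambda x: x`, `h = 0`) and a constant step size, the last iterate contracts geometrically toward the minimizer at the origin.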