Reparameterizing Mirror Descent as Gradient Descent
Authors: Ehsan Amid, Manfred K. Warmuth
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our construction for the reparameterization argument is done for the continuous versions of the updates. Finding general criteria for the discrete versions to closely track their continuous counterparts remains an interesting open problem. |
| Researcher Affiliation | Industry | Ehsan Amid and Manfred K. Warmuth, Google Research, Brain Team, Mountain View, CA ({eamid, manfred}@google.com) |
| Pseudocode | No | The paper provides mathematical derivations and theorems but no pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide concrete access to source code for the described methodology. No links or explicit statements about code release are present. |
| Open Datasets | No | The paper is theoretical and does not report on experiments using datasets. |
| Dataset Splits | No | The paper is theoretical and does not report on experiments or dataset splits. |
| Hardware Specification | No | The paper is theoretical and involves no empirical experiments, so no hardware specifications are provided. |
| Software Dependencies | No | The paper is theoretical and involves no empirical experiments, so no software dependencies are listed. |
| Experiment Setup | No | The paper is theoretical and involves no empirical experiments, so no experimental setup details such as hyperparameters are provided. |
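Since the paper itself contains no pseudocode (see the Pseudocode row above), the following is a minimal illustrative sketch of its best-known example: continuous-time unnormalized exponentiated gradient (EGU) on `w` coincides with continuous-time gradient descent on `u` under the reparameterization `w = u**2 / 4`. The quadratic loss, variable names, step size, and step count below are assumptions chosen for this demo, not taken from the paper; it is a sketch of the idea, not the authors' implementation.

```python
import numpy as np

# Illustrative only: the loss, initialization, and hyperparameters here are
# assumptions for the demo, not values from the paper.

def grad(w, target):
    """Gradient of the quadratic loss L(w) = 0.5 * ||w - target||^2."""
    return w - target

target = np.array([0.2, 1.5, 0.7])   # positive minimizer (assumption)
w0 = np.array([1.0, 0.1, 2.0])        # positive start, as EGU requires
eta, steps = 1e-3, 5000               # small step so discrete ~ continuous

w_egu = w0.copy()                     # mirror descent iterate (EGU on w)
u = 2.0 * np.sqrt(w0)                 # reparameterization: w = u**2 / 4

for _ in range(steps):
    # Discrete EGU (mirror descent with the log link):
    # w <- w * exp(-eta * grad(w))
    w_egu *= np.exp(-eta * grad(w_egu, target))
    # Plain gradient descent on u for the composed loss L(u**2 / 4);
    # the chain rule gives dL/du = (u / 2) * grad(u**2 / 4)
    u -= eta * (u / 2.0) * grad(u**2 / 4.0, target)

w_gd = u**2 / 4.0                     # map the GD iterate back to w-space
print("EGU iterate:        ", w_egu)
print("Reparameterized GD: ", w_gd)
print("max deviation:      ", np.abs(w_egu - w_gd).max())
```

With a small step size the two discrete iterates stay close, since both are Euler-style discretizations of the same flow on `w`; how far this tracking extends to general discrete updates is precisely the open problem the authors flag in the quote under Research Type above.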