Accelerated Mirror Descent in Continuous and Discrete Time
Authors: Walid Krichene, Alexandre Bayen, Peter L. Bartlett
NeurIPS 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We test these methods on numerical examples in Section 5 and comment on their performance. The results are given in Figure 1. |
| Researcher Affiliation | Academia | Walid Krichene (UC Berkeley, walid@eecs.berkeley.edu); Alexandre M. Bayen (UC Berkeley, bayen@berkeley.edu); Peter L. Bartlett (UC Berkeley and QUT, bartlett@berkeley.edu) |
| Pseudocode | Yes | Algorithm 1 "Accelerated mirror descent with distance generating function ψ, regularizer R, step size s, and parameter r ≥ 3" and Algorithm 2 "Accelerated mirror descent with restart" (a hedged sketch of an Algorithm 1-style update appears after this table). |
| Open Source Code | No | The paper does not provide any explicit statements about releasing source code, nor does it include links to a code repository. |
| Open Datasets | No | The paper describes generating data for two different objective functions: a simple quadratic f(x) = ⟨x − x*, Q(x − x*)⟩ for a random positive semi-definite matrix Q, and a log-sum-exp function... where each entry of aᵢ ∈ ℝⁿ and bᵢ ∈ ℝ is i.i.d. normal. No publicly available dataset is mentioned or linked (a sketch of these generated objectives follows the table). |
| Dataset Splits | No | The paper does not discuss dataset splits (training, validation, or test) as it focuses on optimizing generated objective functions rather than using pre-existing datasets with fixed splits. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., CPU, GPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions or specific solver versions). |
| Experiment Setup | Yes | We test the accelerated mirror descent method in Algorithm 1 on simplex-constrained problems in ℝⁿ, n = 100... Figure 1(c) shows the effect of the parameter r, for r ∈ {3, 10, 30, 90}. The restart conditions are: (i) gradient restart, ⟨x^(k+1) − x^(k), ∇f(x^(k))⟩ > 0, and (ii) speed restart, ‖x^(k+1) − x^(k)‖ < ‖x^(k) − x^(k−1)‖ (both transcribed in code after the table). |
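
To make the Pseudocode row concrete, here is a minimal Python sketch of an Algorithm 1-style update on the probability simplex. It assumes the entropy distance-generating function (so the mirror step on z becomes a multiplicative-weights update) and an entropic prox for the regularized sequence; the parameter names `s`, `r`, and `gamma` follow the paper's notation, but the specific prox choices are illustrative assumptions, not the authors' implementation (no code was released, per the Open Source Code row).

```python
import numpy as np

def accelerated_mirror_descent(grad_f, x0, n_iters=1000, s=1e-2, r=3.0, gamma=1.0):
    """Sketch of an Algorithm 1-style accelerated mirror descent update on
    the probability simplex. The entropy mirror map for the z-sequence and
    the entropic prox for x~ are illustrative assumptions."""
    x_tilde = x0.copy()
    z = x0.copy()
    for k in range(n_iters):
        lam = r / (r + k)                    # coupling weight lambda_k = r/(r+k)
        x = lam * z + (1.0 - lam) * x_tilde  # coupled query point x^(k+1)
        g = grad_f(x)
        # Mirror step on z with step size k*s/r; with the entropy
        # distance-generating function this is exponentiated gradient.
        z = z * np.exp(-(k * s / r) * g)
        z /= z.sum()
        # Regularized step for x~ with weight gamma*s; the paper allows a
        # general regularizer R, an entropic prox is assumed here.
        x_tilde = x * np.exp(-gamma * s * g)
        x_tilde /= x_tilde.sum()
    return x_tilde
```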
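
The Open Datasets row notes that all experimental data are synthetically generated. The sketch below reproduces those two objective families; the quadratic follows the quoted formula, while the log-sum-exp assumes the standard form f(x) = ρ log(Σᵢ exp((⟨aᵢ, x⟩ + bᵢ)/ρ)) for the expression the quote elides, and placing the minimizer x* on the simplex is likewise an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, rho = 100, 20, 1.0

# Quadratic f(x) = <x - x*, Q(x - x*)> with a random PSD matrix Q;
# placing the (hypothetical) minimizer x* on the simplex is an assumption.
A = rng.standard_normal((n, n))
Q = A @ A.T                                  # positive semi-definite by construction
x_star = np.full(n, 1.0 / n)
f_quad = lambda x: (x - x_star) @ Q @ (x - x_star)
grad_quad = lambda x: 2.0 * Q @ (x - x_star)

# Log-sum-exp with i.i.d. normal a_i in R^n and b_i in R, assuming the
# standard form f(x) = rho * log(sum_i exp((<a_i, x> + b_i) / rho)).
A_lse = rng.standard_normal((m, n))
b = rng.standard_normal(m)

def f_lse(x):
    u = (A_lse @ x + b) / rho
    u_max = u.max()                          # log-sum-exp trick for stability
    return rho * (u_max + np.log(np.exp(u - u_max).sum()))

def grad_lse(x):
    u = (A_lse @ x + b) / rho
    w = np.exp(u - u.max())
    w /= w.sum()                             # softmax weights
    return A_lse.T @ w
```

Either gradient drops into the sketch above, e.g. `accelerated_mirror_descent(grad_lse, np.full(n, 1.0 / n))`.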
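
The two restart conditions quoted in the Experiment Setup row translate directly into predicates on successive iterates. Only the predicates themselves come from the quoted text; how Algorithm 2 reacts to a trigger (e.g. resetting the momentum by setting z to the current iterate and restarting the step schedule) is an assumption not spelled out above.

```python
import numpy as np

def gradient_restart(x_next, x_curr, g_curr):
    """Trigger when <x^(k+1) - x^(k), grad f(x^(k))> > 0, i.e. the step
    moves against the current descent direction."""
    return np.dot(x_next - x_curr, g_curr) > 0

def speed_restart(x_next, x_curr, x_prev):
    """Trigger when ||x^(k+1) - x^(k)|| < ||x^(k) - x^(k-1)||, i.e. the
    iterates are decelerating."""
    return np.linalg.norm(x_next - x_curr) < np.linalg.norm(x_curr - x_prev)
```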