Stochastic Mirror Descent in Variationally Coherent Optimization Problems

Authors: Zhengyuan Zhou, Panayotis Mertikopoulos, Nicholas Bambos, Stephen Boyd, Peter W. Glynn

NeurIPS 2017

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "simulation results are also presented." "See Figure 2 for a simulation example." "Figure 3: SMD run on the objective function of Fig. 2"
Researcher Affiliation | Academia | Zhengyuan Zhou (Stanford University, zyzhou@stanford.edu); Panayotis Mertikopoulos (Univ. Grenoble Alpes, CNRS, Inria, LIG, panayotis.mertikopoulos@imag.fr); Nicholas Bambos (Stanford University, bambos@stanford.edu); Stephen Boyd (Stanford University, boyd@stanford.edu); Peter Glynn (Stanford University, glynn@stanford.edu)
Pseudocode | Yes | Algorithm 1: Stochastic mirror descent (SMD); Algorithm 2: Mirror descent (MD) (a hedged sketch follows this table)
Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository.
Open Datasets | No | The paper describes simulations on a synthetic objective function (g(r, θ) from Figure 2) but does not use, or provide access information for, any publicly available dataset.
Dataset Splits | No | The paper does not specify any dataset split information (e.g., percentages or counts for training, validation, or test sets).
Hardware Specification | No | The paper does not provide details about the hardware (e.g., GPU/CPU models, memory) used to run its simulations.
Software Dependencies | No | The paper does not list software dependencies with version numbers (e.g., programming languages, libraries, or solvers).
Experiment Setup | Yes | "Figure 3: SMD run on the objective function of Fig. 2 with γ_n ∝ n^{-1/2} and Gaussian random noise with standard deviation about 150% the mean value of the gradient."
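Since the paper provides pseudocode for SMD (Algorithm 1) but no released code, the following is a minimal sketch of how the quoted Figure 3 setup could be reproduced. It is written for illustration only: the Euclidean mirror map (which reduces SMD to projected stochastic gradient descent), the toy quadratic objective, and all function and parameter names (smd, grad, noise_std, project) are assumptions, not the authors' implementation. Only the step-size schedule γ_n ∝ n^{-1/2} and the Gaussian gradient noise with standard deviation around 150% of the typical gradient magnitude are taken from the quoted caption.

```python
# Hedged sketch of stochastic mirror descent with a Euclidean mirror map
# (i.e., projected SGD), configured roughly like the Figure 3 caption:
# gamma_n proportional to n^(-1/2) and Gaussian gradient noise with std
# about 1.5x a typical gradient magnitude. Objective and names are assumptions.
import numpy as np


def smd(grad, x0, n_iters, step=lambda n: n ** -0.5,
        noise_std=None, project=lambda x: x, rng=None):
    """Stochastic mirror descent, Euclidean special case.

    grad      : callable returning the exact gradient at x
    x0        : initial point
    step      : step-size schedule gamma_n (default gamma_n = n^(-1/2))
    noise_std : std of additive Gaussian noise on each gradient sample
    project   : Euclidean projection onto the feasible set (identity by default)
    """
    if rng is None:
        rng = np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for n in range(1, n_iters + 1):
        g = grad(x)
        if noise_std is not None:
            # Noisy first-order oracle: exact gradient plus Gaussian noise.
            g = g + rng.normal(scale=noise_std, size=g.shape)
        # Mirror step; with the Euclidean mirror map this is a projected gradient step.
        x = project(x - step(n) * g)
        iterates.append(x.copy())
    return np.array(iterates)


if __name__ == "__main__":
    # Hypothetical usage on a toy quadratic ||x||^2 (not the paper's g(r, θ)).
    grad = lambda x: 2.0 * x
    # Rough stand-in for "the mean value of the gradient" in the caption.
    typical_grad = np.linalg.norm(grad(np.array([2.0, -1.5])))
    xs = smd(grad, x0=np.array([2.0, -1.5]), n_iters=5000,
             noise_std=1.5 * typical_grad)
    print("final iterate:", xs[-1])
```

Under these assumptions the iterates drift toward the minimizer despite the large noise, which is the qualitative behavior the Figure 3 caption describes; the paper's actual simulation uses its nonconvex objective g(r, θ) from Figure 2 rather than this toy quadratic.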