Non-Ergodic Alternating Proximal Augmented Lagrangian Algorithms with Optimal Rates

Authors: Quoc Tran-Dinh

NeurIPS 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We verify our algorithms on different numerical examples and compare them with some state-of-the-art methods.
Researcher Affiliation | Academia | Quoc Tran-Dinh, Department of Statistics and Operations Research, University of North Carolina at Chapel Hill. Address: Hanes Hall 333, UNC-Chapel Hill, NC 27599, USA. Email: quoctd@email.unc.edu
Pseudocode | Yes | Algorithm 1 (Non-Ergodic Alternating Proximal Augmented Lagrangian Algorithm, NEAPAL); an illustrative sketch of the alternating proximal structure follows the table.
Open Source Code | No | The paper does not contain any explicit statement about releasing source code or a link to a code repository for the methodology described.
Open Datasets | No | The paper describes generating synthetic data for square-root LASSO and pre-processing logo images for low-rank matrix recovery, but does not provide concrete access information (link, DOI, formal citation with author/year) to a publicly available or open dataset used in their experiments.
Dataset Splits | No | The paper does not provide specific details on dataset splits (e.g., percentages or sample counts for training, validation, or test sets) or mention a cross-validation setup.
Hardware Specification | Yes | All the experiments are implemented in Matlab R2014b, running on a MacBook Pro (Retina), 2.7 GHz Intel Core i5 with 16 GB RAM.
Software Dependencies | Yes | All the experiments are implemented in Matlab R2014b, running on a MacBook Pro.
Experiment Setup | Yes | For ASGARD, we use the same setting as in [23], and for Chambolle-Pock's (CP) method, we use step-sizes σ = τ = ‖B‖⁻¹ and θ = 1. In Algorithm 1, we choose β₀ := ‖λ⋆‖/(‖B‖‖y⁰ − y⋆‖) as suggested by Theorem 3.1 to trade off the objective residual and feasibility gap... In Algorithm 2, we set β₀ := µ_g/(4‖B‖²) as suggested by our theory, where µ_g := 0.1 σ_min(B) is a guess for the restricted strong convexity parameter.
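
The pseudocode row above refers to Algorithm 1 (NEAPAL). As rough orientation only, the following Matlab sketch shows a generic, non-accelerated alternating proximal augmented Lagrangian iteration on an assumed two-block model problem, min over (x, y) of ‖x‖₁ + ½‖y‖² subject to Ax + By = c. It is not the paper's NEAPAL scheme (which adds acceleration and parameter-update rules to obtain its non-ergodic rates), and all problem data below are synthetic placeholders.

```matlab
% A minimal sketch, assuming the two-block model problem
%   min_{x,y} ||x||_1 + 0.5*||y||^2   s.t.   A*x + B*y = c.
% Plain linearized proximal steps on the augmented Lagrangian, alternating
% over the two blocks, then dual ascent; NOT the paper's accelerated updates.
rng(1);
n = 200; p = 60; m = 50;
A = randn(m, n); B = randn(m, p); c = randn(m, 1);   % synthetic placeholders

beta = 1;                      % penalty parameter (placeholder value)
La = beta * norm(A)^2;         % step bound for the x-block
Lb = beta * norm(B)^2;         % step bound for the y-block
x = zeros(n, 1); y = zeros(p, 1); lam = zeros(m, 1);
soft = @(v, t) sign(v) .* max(abs(v) - t, 0);        % prox of t*||.||_1

for k = 1:2000
    % x-step: prox-gradient step on the augmented Lagrangian in x
    r = A * x + B * y - c;
    x = soft(x - A' * (lam + beta * r) / La, 1 / La);
    % y-step: prox-gradient step in y (prox of 0.5*||.||^2 shrinks by 1/(1+t))
    r = A * x + B * y - c;
    y = (y - B' * (lam + beta * r) / Lb) / (1 + 1 / Lb);
    % dual ascent on the multiplier
    lam = lam + beta * (A * x + B * y - c);
end
fprintf('feasibility gap ||Ax + By - c|| = %.2e\n', norm(A * x + B * y - c));
```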
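
Similarly, here is a minimal sketch of the parameter rules quoted in the Experiment Setup row, under the symbol reconstruction given there. B, lam_star, y0, and y_star are hypothetical placeholders, since the theoretical choice of β₀ depends on a primal-dual solution (λ⋆, y⋆) that is unknown in practice.

```matlab
% A minimal sketch of the quoted parameter choices; all data are placeholders.
B = randn(50, 200);                           % linear operator from the model problem
lam_star = randn(50, 1);                      % hypothetical optimal multiplier
y0 = zeros(200, 1); y_star = randn(200, 1);   % initial point / hypothetical solution

sigma = 1 / norm(B); tau = sigma; theta = 1;  % Chambolle-Pock step-sizes
beta0_alg1 = norm(lam_star) / (norm(B) * norm(y0 - y_star));  % Theorem 3.1 choice
mu_g = 0.1 * min(svd(B));                     % guess: 0.1 * sigma_min(B)
beta0_alg2 = mu_g / (4 * norm(B)^2);          % Algorithm 2 choice
```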