Convergence of Common Proximal Methods for L1-Regularized Least Squares

Authors: Shaozhe Tao, Daniel Boley, Shuzhong Zhang

IJCAI 2015

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "5 Numerical Examples: We consider examples of compressed sensing to show different convergence behaviors to support our analysis. ... The numerical results are summarized in Table 1. ... Fig. 1 illustrates the methods' behavior for the instance marked in Table 1, to show how the theorems established before explain the behaviors in practice." |
| Researcher Affiliation | Academia | Shaozhe Tao, Daniel Boley, Shuzhong Zhang; University of Minnesota, Minneapolis, MN 55455, USA; {taoxx120,boley,zhangs}@umn.edu |
| Pseudocode | Yes | "Algorithm 1: One pass of modified ADMM" |
| Open Source Code | No | The paper does not provide an explicit statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | No | The paper generates synthetic data ("We let A ∈ R^{m×n} be a Gaussian matrix whose elements are i.i.d. distributed as N(0, 1), and ϵ be a vector whose elements are i.i.d. distributed as N(0, σ²) with σ = 10⁻³.") rather than using a publicly available dataset with concrete access information. |
| Dataset Splits | No | The paper describes synthetic data generation but does not provide training/validation/test splits, percentages, or references to predefined splits. |
| Hardware Specification | No | The paper does not provide specific hardware details (such as exact GPU/CPU models or memory amounts) used for its experiments. |
| Software Dependencies | No | The paper does not name specific libraries or solvers with version numbers needed to replicate the experiments. |
| Experiment Setup | Yes | "Problem Setting: m = 64, n = 512, s = 7, λ = 0.3 ... We let A ∈ R^{m×n} be a Gaussian matrix whose elements are i.i.d. distributed as N(0, 1), and ϵ be a vector whose elements are i.i.d. distributed as N(0, σ²) with σ = 10⁻³." |
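Since the paper releases no code, the quoted experiment setup is enough to sketch a reproduction attempt. The snippet below is a minimal, hypothetical reconstruction (not the authors' implementation): it generates the synthetic compressed-sensing instance exactly as quoted (m = 64, n = 512, s = 7, λ = 0.3, σ = 10⁻³) and solves the L1-regularized least-squares problem with plain ISTA as one representative proximal method; the sparse ground-truth construction and iteration count are assumptions.

```python
import numpy as np

# Hypothetical reconstruction of the paper's setup:
#   min_x  0.5 * ||A x - b||^2 + lam * ||x||_1
rng = np.random.default_rng(0)
m, n, s, lam, sigma = 64, 512, 7, 0.3, 1e-3      # settings quoted from the paper

A = rng.standard_normal((m, n))                   # Gaussian matrix, entries i.i.d. N(0, 1)
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)    # s-sparse ground truth (assumed)
x_true[support] = rng.standard_normal(s)
b = A @ x_true + sigma * rng.standard_normal(m)   # noise i.i.d. N(0, sigma^2)

def soft_threshold(v, t):
    """Prox operator of t*||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

L = np.linalg.norm(A, 2) ** 2                     # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):
    grad = A.T @ (A @ x - b)                      # gradient of the smooth part
    x = soft_threshold(x - grad / L, lam / L)     # ISTA (proximal gradient) step

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.linalg.norm(x, 1)
```

Swapping the ISTA loop for FISTA or the paper's modified ADMM (Algorithm 1) would let one compare the convergence behaviors the table above refers to.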