Accelerated Variance Reduced Stochastic ADMM

Authors: Yuanyuan Liu, Fanhua Shang, James Cheng

AAAI 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our experimental results show the effectiveness of ASVRG-ADMM.
Researcher Affiliation | Academia | Yuanyuan Liu, Fanhua Shang, James Cheng; Department of Computer Science and Engineering, The Chinese University of Hong Kong; {yyliu, fhshang, jcheng}@cse.cuhk.edu.hk
Pseudocode | Yes | Algorithm 1: ASVRG-ADMM for the strongly convex case; Algorithm 2: ASVRG-ADMM for the general convex case. (A hedged MATLAB sketch of one variance-reduced ADMM epoch, illustrating the linearized penalty update, appears after this table.)
Open Source Code | No | No explicit statement or link to the authors' open-source code for the described methodology was found.
Open Datasets | Yes | We used four publicly available data sets in our experiments, as listed in Table 2. Note that except for STOC-ADMM, all the other algorithms adopted the linearization of the penalty term (β/2)||Ax − y + z||^2 to avoid the inversion of ((1/η_k)I_{d1} + βA^T A) at each iteration, which can be computationally expensive for large matrices. The parameters of ASVRG-ADMM are set as follows: m = 2n/b and γ = 1, as in (Zhong and Kwok 2014b; Zheng and Kwok 2016), along with η and β. Figure 1 shows the training error (i.e., the training objective value minus the minimum) and testing loss of all the algorithms for the general convex problem on the four data sets. SAG-ADMM could not generate experimental results on the HIGGS data set because it ran out of memory. These figures clearly indicate that the variance-reduced stochastic ADMM algorithms (including SAG-ADMM, SCAS-ADMM, SVRG-ADMM and ASVRG-ADMM) converge much faster than those without variance reduction techniques, e.g., STOC-ADMM and OPG-ADMM. Notably, ASVRG-ADMM consistently outperforms all the other algorithms in terms of convergence rate under all settings, which empirically verifies our theoretical result that ASVRG-ADMM has a faster convergence rate of O(1/T^2), as opposed to the best known rate of O(1/T).
Dataset Splits | No | The paper mentions 'training set' and 'test set' splits but does not explicitly describe a 'validation set' split or its details.
Hardware Specification | Yes | All methods were implemented in MATLAB, and the experiments were performed on a PC with an Intel i5-2400 CPU and 16GB RAM.
Software Dependencies | No | All methods were implemented in MATLAB. (No version number for MATLAB or other libraries is provided.)
Experiment Setup | Yes | The parameters of ASVRG-ADMM are set as follows: m = 2n/b and γ = 1, as in (Zhong and Kwok 2014b; Zheng and Kwok 2016), along with η and β. (A hypothetical driver reflecting these settings follows the epoch sketch below.)
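
As a rough illustration of the pseudocode noted above, here is a minimal MATLAB sketch of one epoch of a variance-reduced stochastic ADMM for a problem of the form min_x (1/n) Σ_i f_i(x) + λ||y||_1 subject to Ax − y = 0, with f_i the logistic loss. This is not the authors' code: it shows only the SVRG-style gradient estimator and the linearized penalty update that avoids inverting (1/η)I + βA^T A; the acceleration (momentum) step that distinguishes ASVRG-ADMM, in which γ appears, is omitted. The function name, helper, and all parameters are assumptions for illustration (saved as vr_admm_epoch.m).

    function [x, y, z] = vr_admm_epoch(x, y, z, X, t, A, eta, beta, lambda, b, m)
        % One epoch of an SVRG-style stochastic ADMM (sketch, not the paper's code)
        % for min_x (1/n)*sum_i log(1 + exp(-t(i)*X(i,:)*x)) + lambda*||y||_1
        % s.t. A*x - y = 0, with scaled dual variable z.
        n = size(X, 1);
        x_snap = x;                            % snapshot point
        g_snap = logistic_grad(X, t, x_snap);  % full gradient at the snapshot
        for k = 1:m
            idx = randi(n, b, 1);              % sample a mini-batch of size b
            % variance-reduced (SVRG-style) gradient estimator
            v = logistic_grad(X(idx, :), t(idx), x) ...
                - logistic_grad(X(idx, :), t(idx), x_snap) + g_snap;
            % x-update with the quadratic penalty linearized, so no inversion
            % of (1/eta)*I + beta*A'*A is required
            x = x - eta * (v + beta * (A' * (A*x - y + z)));
            % y-update: soft-thresholding, the prox of lambda*||.||_1
            u = A*x + z;
            y = sign(u) .* max(abs(u) - lambda/beta, 0);
            % scaled dual (multiplier) update
            z = z + A*x - y;
        end
    end

    function g = logistic_grad(X, t, x)
        % average gradient of the logistic loss over the rows of X
        g = -(X' * (t ./ (1 + exp(t .* (X * x))))) / size(X, 1);
    end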
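
A hypothetical driver reflecting the reported inner-loop length m = 2n/b follows; the mini-batch size, step size η, penalty β, regularizer λ, and epoch count are placeholders to be tuned per data set, not values from the paper (γ = 1 would enter the omitted acceleration step):

    % Hypothetical driver; X is the n-by-d design matrix, t the labels in
    % {-1,+1}, and A the constraint/penalty matrix (all assumed given).
    n = size(X, 1);
    b = 100;                              % mini-batch size (placeholder)
    m = floor(2*n / b);                   % inner-loop length as reported: m = 2n/b
    eta = 1e-3; beta = 1; lambda = 1e-5;  % placeholders to tune
    x = zeros(size(X, 2), 1);
    y = zeros(size(A, 1), 1);
    z = zeros(size(A, 1), 1);
    for epoch = 1:30
        [x, y, z] = vr_admm_epoch(x, y, z, X, t, A, eta, beta, lambda, b, m);
    end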