Accelerated Proximal Gradient Methods for Nonconvex Programming

Authors: Huan Li, Zhouchen Lin

NeurIPS 2015

Reproducibility assessment: each variable below is listed with its result and the supporting evidence (the LLM response) quoted from the paper.

Research Type: Experimental
Evidence: "In this section, we test the performance of our algorithm on the problem of Sparse Logistic Regression (LR)... We test the performance on the real-sim data set... The result is reported in Table 2. We also plot the curves of objective function values vs. iteration number and CPU time in Figure 1."

Researcher Affiliation: Academia
Evidence: "Huan Li, Zhouchen Lin. Key Lab. of Machine Perception (MOE), School of EECS, Peking University, P. R. China; Cooperative Medianet Innovation Center, Shanghai Jiaotong University, P. R. China. lihuanss@pku.edu.cn, zlin@pku.edu.cn"

Pseudocode: Yes
Evidence: Algorithm 1 (Monotone APG) and Algorithm 2 (Nonmonotone APG). A hedged sketch of a nonmonotone APG-style iteration appears after this table.

Open Source Code: No
Evidence: The paper does not explicitly state that source code for the proposed methods (monotone APG, nonmonotone APG) is available, nor does it provide a link to any.

Open Datasets: Yes
Evidence: "We test the performance on the real-sim data set" (footnote: http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets).

Dataset Splits: No
Evidence: The paper states "We randomly choose 90% of the data as training data and the rest as test data" but does not describe a separate validation split or give explicit proportions for one.

Hardware Specification: Yes
Evidence: "All algorithms are run on Matlab 2011a and Windows 7 with an Intel Core i3 2.53 GHz CPU and 4GB memory."

Software Dependencies: Yes
Evidence: "All algorithms are run on Matlab 2011a."

Experiment Setup: Yes
Evidence: "We follow [19] to set λ = 0.0001, θ = 0.1λ and the starting point as zero vectors. In nmAPG we set η = 0.8. In IFB the inertial parameter β is set at 0.01 and the Lipschitz constant is computed by backtracking." Illustrative sketches of this setup and of a nonmonotone APG-style loop follow below.

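The setup row quotes λ = 0.0001 and θ = 0.1λ, but the excerpt never spells out the sparsity penalty. Below is a minimal sketch of the experiment's objective pieces, assuming a capped-ℓ1 penalty λ Σ_i min(|w_i|, θ); that form is consistent with the two quoted parameters but is an assumption, not something the excerpt confirms. The helper names (logistic_loss_grad, capped_l1, prox_capped_l1) are hypothetical.

```python
import numpy as np
from scipy.special import expit

# Parameters quoted in the paper's setup (following its reference [19]).
LAM = 1e-4          # lambda, penalty weight
THETA = 0.1 * LAM   # theta, capping parameter (assumed role)

def logistic_loss_grad(w, X, y):
    """Average logistic loss and its gradient for labels y in {-1, +1}."""
    z = y * (X @ w)
    loss = np.mean(np.logaddexp(0.0, -z))
    grad = -(X.T @ (y * expit(-z))) / X.shape[0]
    return loss, grad

def capped_l1(w, lam=LAM, theta=THETA):
    """Assumed nonconvex penalty: lam * sum_i min(|w_i|, theta)."""
    return lam * np.sum(np.minimum(np.abs(w), theta))

def prox_capped_l1(v, step, lam=LAM, theta=THETA):
    """Coordinate-wise prox of the capped-l1 penalty.

    Compares the minimizer on |x| <= theta (a clipped soft threshold)
    with the minimizer on |x| >= theta (where the penalty is constant)
    and keeps whichever gives the smaller prox objective.
    """
    t = step * lam
    a = np.sign(v) * np.clip(np.abs(v) - t, 0.0, theta)  # candidate, |a| <= theta
    b = np.sign(v) * np.maximum(np.abs(v), theta)        # candidate, |b| >= theta
    ha = 0.5 * (a - v) ** 2 + t * np.abs(a)
    hb = 0.5 * (b - v) ** 2 + t * theta
    return np.where(ha <= hb, a, b)
```

The zero starting vector from the quoted setup would be np.zeros(X.shape[1]) here.
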
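To make the pseudocode row concrete, here is a hedged sketch of a nonmonotone APG-style loop with a Zhang-Hager-style averaged reference value, which is the general shape of the paper's Algorithm 2. The value η = 0.8 matches the quoted setup, while delta, the fixed step size, and the exact update order are illustrative assumptions (the paper's own procedure estimates the Lipschitz constant by backtracking); check details against the paper's Algorithm 2 before relying on them.

```python
import numpy as np

def nonmonotone_apg(grad_f, F, prox, x0, step, eta=0.8, delta=1e-5, iters=500):
    """Sketch of a nonmonotone APG-style loop (not a verbatim transcription
    of the paper's Algorithm 2).

    grad_f(x): gradient of the smooth part; F(x): full objective value;
    prox(v, step): proximal map of the (possibly nonconvex) regularizer.
    """
    x_prev = x = z = x0.copy()
    t_prev, t = 0.0, 1.0
    c, q = F(x0), 1.0          # nonmonotone reference value and its weight
    for _ in range(iters):
        # Extrapolate from the last prox point z and the last two iterates.
        y = x + (t_prev / t) * (z - x) + ((t_prev - 1.0) / t) * (x - x_prev)
        z = prox(y - step * grad_f(y), step)
        x_prev = x
        Fz = F(z)
        if Fz <= c - delta * np.sum((z - y) ** 2):
            x = z                                  # accelerated step accepted
        else:
            v = prox(x - step * grad_f(x), step)   # safeguard prox-gradient step
            x = z if Fz <= F(v) else v
        t_prev, t = t, (np.sqrt(4.0 * t * t + 1.0) + 1.0) / 2.0
        q_next = eta * q + 1.0
        c = (eta * q * c + F(x)) / q_next          # averaged reference update
        q = q_next
    return x
```

With the pieces from the previous sketch, a call could look like nonmonotone_apg(lambda w: logistic_loss_grad(w, X, y)[1], lambda w: logistic_loss_grad(w, X, y)[0] + capped_l1(w), prox_capped_l1, np.zeros(X.shape[1]), step=1e-2), where the fixed step size stands in for the backtracking estimate.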