Accelerated Quasi-Newton Proximal Extragradient: Faster Rate for Smooth Convex Optimization
Authors: Ruichen Jiang, Aryan Mokhtari
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we compare the numerical performance of our proposed A-QNPE method with NAG and the classical BFGS quasi-Newton method. ... Figure 1: Numerical results for logistic regression on a synthetic dataset. |
| Researcher Affiliation | Academia | Ruichen Jiang, ECE Department, The University of Texas at Austin, rjiang@utexas.edu; Aryan Mokhtari, ECE Department, The University of Texas at Austin, mokhtari@austin.utexas.edu |
| Pseudocode | Yes | Algorithm 1 Accelerated Quasi-Newton Proximal Extragradient Method |
| Open Source Code | No | The paper does not provide a specific link or explicit statement about the availability of the source code for the described methodology. |
| Open Datasets | No | We perform our numerical experiments on a synthetic dataset and the data generation process is described in Appendix F. In the first experiment of logistic regression, the dataset consists of $n$ data points $\{(a_i, y_i)\}_{i=1}^n$ ... we generate $\{a_i\}_{i=1}^n$ by adding noise and appending an extra dimension to $\{\hat{a}_i\}_{i=1}^n$. (See the data-generation sketch below the table.) |
| Dataset Splits | No | The paper describes generating synthetic datasets but does not provide any specific information about training, validation, or test splits, nor does it refer to standard splits. |
| Hardware Specification | Yes | All experiments are conducted using MATLAB R2021b on a MacBook Pro with an Apple M1 chip and 16GB RAM. |
| Software Dependencies | Yes | All experiments are conducted using MATLAB R2021b |
| Experiment Setup | Yes | In our experiment, we set n = 2,000, d = 150 and σ = 0.8. ... In our experiment, we set n = d = 250. ... We also use a line search scheme in NAG and BFGS to obtain their best performance. (A baseline sketch follows the table.) |
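The paper defers its exact data generation to Appendix F, so the following is only a minimal Python sketch of one plausible reading of the quoted description: base points are drawn at random, labeled by a ground-truth separator, perturbed with Gaussian noise of scale σ = 0.8, and given an extra (bias) dimension. The separator `x_true`, the noise model, and the constant bias column are our assumptions, not the paper's recipe.

```python
import numpy as np

# Hypothetical reconstruction of the synthetic logistic regression data.
# The paper only states n = 2,000, d = 150, sigma = 0.8 and that the a_i
# are built by adding noise and appending an extra dimension to base
# points; the exact recipe is in its Appendix F (assumptions below).
rng = np.random.default_rng(0)
n, d, sigma = 2000, 150, 0.8

# Base points in R^{d-1} and an assumed ground-truth separator.
a_base = rng.standard_normal((n, d - 1))
x_true = rng.standard_normal(d - 1)
y = np.sign(a_base @ x_true)            # labels in {-1, +1}

# Add Gaussian noise and append a constant extra dimension (bias term).
a = np.hstack([a_base + sigma * rng.standard_normal((n, d - 1)),
               np.ones((n, 1))])        # shape (n, d)
```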
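For the quoted baseline setup ("a line search scheme in NAG and BFGS"), here is a hedged sketch of one way to reproduce the BFGS baseline: SciPy's BFGS implementation performs a Wolfe line search internally, so minimizing the logistic loss with it is a reasonable stand-in. The helper names and the reuse of `a`, `y` from the sketch above are ours, and the paper's experiments were run in MATLAB, not Python; NAG is omitted since SciPy does not ship it.

```python
import numpy as np
from scipy.optimize import minimize

def logistic_loss(x, a, y):
    # f(x) = (1/n) * sum_i log(1 + exp(-y_i * <a_i, x>))
    return np.mean(np.logaddexp(0.0, -y * (a @ x)))

def logistic_grad(x, a, y):
    m = -y * (a @ x)
    # d/dx log(1 + e^m) = sigmoid(m) * dm/dx, with dm/dx = -y_i * a_i
    return a.T @ (-y / (1.0 + np.exp(-m))) / len(y)

# `a` (n x d) and `y` (labels in {-1, +1}) come from the data sketch above.
res = minimize(logistic_loss, np.zeros(a.shape[1]), args=(a, y),
               jac=logistic_grad, method="BFGS")
print(res.fun, res.nit)   # final loss and iteration count
```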