Optimal and Adaptive Monteiro-Svaiter Acceleration

Authors: Yair Carmon, Danielle Hausler, Arun Jambulapati, Yujia Jin, Aaron Sidford

NeurIPS 2022

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | "Finally, we report empirical results (Section 4). On logistic regression problems, combining our optimal acceleration scheme with our adaptive oracle outperforms previously proposed accelerated second-order methods." |
| Researcher Affiliation | Academia | Tel Aviv University (ycarmon@tauex.tau.ac.il, hausler@mail.tau.ac.il); Stanford University ({jmblpati,yujiajin,sidford}@stanford.edu) |
| Pseudocode | Yes | Algorithm 1: Optimal MS Acceleration |
| Open Source Code | Yes | "The code for our experiments is available at https://github.com/danielle-hausler/ms-optimal." |
| Open Datasets | Yes | "logistic regression on the a9a dataset [15]" |
| Dataset Splits | No | The paper names the datasets it uses but does not describe how they were split into training, validation, and test sets (e.g., percentages, sample counts, or references to predefined splits). |
| Hardware Specification | No | The paper does not specify the hardware (CPU/GPU models, processor types, or memory amounts) used to run its experiments. |
| Software Dependencies | No | The paper mentions L-BFGS-B from SciPy and cites "Scikit-learn: Machine learning in Python," but gives no version numbers for these or any other software dependencies needed for replication. |
| Experiment Setup | Yes | "For all runs, we set the initial point x0 = 0 and use a batch size of 100 for mini-batch stochastic gradient descent. We implement the adaptive methods using λ0 = 1, σ = 0.5, and α = 2." |

The sketches below illustrate the setup these rows describe.
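As the Open Datasets and Software Dependencies rows note, the experiments run logistic regression on a9a and use SciPy's L-BFGS-B as a solver. Below is a minimal sketch of that setup, not the authors' code: it assumes a local copy of the LIBSVM-format a9a file, and the loss/gradient helpers are illustrative.

```python
# Minimal sketch (not the authors' code) of the reported setup:
# logistic regression on a9a, minimized with SciPy's L-BFGS-B.
# Assumes a local LIBSVM-format "a9a" file (hypothetical path).
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import load_svmlight_file

X, y = load_svmlight_file("a9a")  # labels y are in {-1, +1}
X = X.toarray()
n, d = X.shape

def logistic_loss(w):
    # average logistic loss: (1/n) * sum_i log(1 + exp(-y_i <x_i, w>))
    return np.mean(np.logaddexp(0.0, -y * (X @ w)))

def logistic_grad(w):
    s = 1.0 / (1.0 + np.exp(y * (X @ w)))  # sigmoid(-y_i <x_i, w>)
    return -(X.T @ (y * s)) / n

w0 = np.zeros(d)  # the paper initializes at x0 = 0
res = minimize(logistic_loss, w0, jac=logistic_grad, method="L-BFGS-B")
print(f"final loss: {res.fun:.6f} after {res.nit} iterations")
```

Note that the SciPy and scikit-learn versions are unpinned here, which is exactly the replication gap the Software Dependencies row flags.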
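The Pseudocode row refers to Algorithm 1, which builds on the Monteiro-Svaiter framework. The following is a hedged sketch of the standard MS oracle condition, using the σ = 0.5 the paper reports; the function name and calling convention are assumptions, not a transcription of Algorithm 1.

```python
# Hedged sketch of the Monteiro-Svaiter (MS) oracle condition that
# MS-type acceleration schemes verify at each step; this is the standard
# condition from the MS framework, not the authors' API.
import numpy as np

def ms_condition_holds(x, y_t, lam, grad_f, sigma=0.5):
    """Check the MS condition ||x - y_t + lam * grad_f(x)|| <= sigma * ||x - y_t||.

    x: candidate iterate returned by an inner solver (e.g., a Newton-type step)
    y_t: the extrapolated query point of the outer accelerated loop
    lam: the step-size parameter paired with x
    """
    residual = np.linalg.norm(x - y_t + lam * grad_f(x))
    return residual <= sigma * np.linalg.norm(x - y_t)
```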
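Finally, the Experiment Setup row reports x0 = 0, mini-batches of size 100, and adaptive parameters λ0 = 1 and σ = 0.5. The sketch below shows one plausible reading of that configuration; the helper name and the uniform-sampling scheme are illustrative assumptions.

```python
# Hedged sketch of the reported experiment configuration: x0 = 0,
# mini-batches of size 100 for stochastic gradients, and adaptive
# parameters lam0 = 1 and sigma = 0.5.
import numpy as np

rng = np.random.default_rng(0)  # seed chosen here for reproducibility

def minibatch_logistic_grad(w, X, y, batch_size=100):
    """Logistic-loss gradient estimated on a uniformly sampled mini-batch."""
    idx = rng.choice(X.shape[0], size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    s = 1.0 / (1.0 + np.exp(yb * (Xb @ w)))  # sigmoid(-y_i <x_i, w>)
    return -(Xb.T @ (yb * s)) / batch_size

lam0, sigma = 1.0, 0.5   # adaptive-method parameters reported in the paper
w = np.zeros(123)        # a9a has 123 features; the paper sets x0 = 0
```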