Quantum speedups for stochastic optimization

Authors: Aaron Sidford, Chenyi Zhang

Venue: NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We provide two new methods for the special case of minimizing a Lipschitz convex function. Each method obtains a dimension versus accuracy trade-off which is provably unachievable classically, and we prove that one method is asymptotically optimal in low-dimensional settings. Additionally, we provide quantum algorithms for computing a critical point of a smooth non-convex function at rates not known to be achievable classically. (A classical baseline for this setting is sketched after the table.)
Researcher Affiliation | Academia | Aaron Sidford and Chenyi Zhang, {sidford,chenyiz}@stanford.edu (Stanford University).
Pseudocode | Yes | The paper includes several algorithms presented in pseudocode, e.g., 'Algorithm 1: Quantum variance reduction' and 'Algorithm 2: Quantum accelerated stochastic approximation (Q-AC-SA)'.
Open Source Code | No | The paper discusses the potential for quantum algorithms to surpass classical ones and mentions standard techniques for implementing quantum analogs of classical oracles (in theory). However, it does not provide any link to, or statement about, open-source code for the methods it describes.
Open Datasets | No | The paper focuses on theoretical analysis and algorithm design, presenting new quantum algorithms and their query complexities. It does not conduct empirical studies with specific datasets; therefore, no training dataset information is provided.
Dataset Splits | No | The paper focuses on theoretical analysis and algorithm design, presenting new quantum algorithms and their query complexities. It does not conduct empirical studies with specific datasets; therefore, no validation split information is provided.
Hardware Specification | No | The paper focuses on theoretical analysis and algorithm design and does not describe any empirical experiments requiring specific hardware.
Software Dependencies | No | The paper focuses on theoretical analysis and algorithm design and does not specify any software dependencies for empirical experiments.
Experiment Setup | No | The paper describes theoretical algorithms and their properties, not an experimental setup with hyperparameters or training settings.
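For context on the "Research Type" row: the paper studies minimizing a Lipschitz convex function given a stochastic gradient oracle, and its headline result is a dimension versus accuracy trade-off that is provably unachievable classically. The sketch below is not from the paper and is not one of its quantum algorithms; it is a minimal classical baseline (projected stochastic subgradient descent), whose well-known O(1/eps^2) query complexity is the kind of classical rate the quantum methods are compared against. The function and oracle names here are hypothetical, chosen only for illustration.

```python
import numpy as np

def stochastic_subgradient_descent(grad_oracle, dim, radius, lipschitz, n_queries, seed=0):
    """Projected stochastic subgradient descent on a Euclidean ball.

    With step size radius / (lipschitz * sqrt(n_queries)), the averaged iterate
    is O(radius * lipschitz / sqrt(n_queries))-suboptimal in expectation, i.e.
    roughly 1/eps**2 oracle queries suffice for accuracy eps (classical rate).
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    eta = radius / (lipschitz * np.sqrt(n_queries))
    avg = np.zeros(dim)
    for t in range(n_queries):
        g = grad_oracle(x, rng)           # one stochastic (sub)gradient query
        x = x - eta * g
        norm = np.linalg.norm(x)
        if norm > radius:                 # project back onto the feasible ball
            x *= radius / norm
        avg += (x - avg) / (t + 1)        # running average of iterates
    return avg

# Toy usage: minimize the Lipschitz convex f(x) = ||x - x_star||_1
# with additive Gaussian noise on the subgradient.
if __name__ == "__main__":
    dim = 10
    x_star = np.linspace(-0.5, 0.5, dim)

    def grad_oracle(x, rng):
        return np.sign(x - x_star) + rng.normal(scale=0.1, size=dim)

    x_hat = stochastic_subgradient_descent(
        grad_oracle, dim, radius=2.0,
        lipschitz=np.sqrt(dim) + 0.1, n_queries=20000)
    print(np.abs(x_hat - x_star).sum())   # suboptimality shrinks as queries grow
```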