Quantum Algorithm for Online Exp-concave Optimization

Authors: Jianhao He, Chengchang Liu, Xutong Liu, Lvzhou Li, John C.S. Lui

ICML 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We present quantum online quasi-Newton methods to tackle the problem and show that there exist quantum advantages for such problems. Our method approximates the Hessian by quantum-estimated inexact gradients and achieves O(n log T) regret with O(1) queries at each round, where n is the dimension of the decision set and T is the total number of decision rounds. Such regret improves on the optimal classical algorithm by a factor of T^{2/3}. The proof is given in Appendix B.1. (A minimal illustrative sketch of the classical Online Newton Step baseline follows the table.)
Researcher Affiliation | Academia | (1) Department of Computer Science and Engineering, The Chinese University of Hong Kong, Hong Kong, China; (2) School of Computer Science and Engineering, Sun Yat-sen University, Guangzhou, China.
Pseudocode | Yes | Algorithm 1: Online Newton Step Method (Hazan et al., 2007); Algorithm 2: Quantum Online Quasi-Newton Method (QONS); Algorithm 3: Quantum Adaptive Gradient Method; Algorithm 4: Online Newton Method with Hessian Update.
Open Source Code | No | No explicit statement about providing open-source code or a link to a code repository for the described methodology was found.
Open Datasets | No | This paper is theoretical and does not involve training models on datasets.
Dataset Splits | No | This paper is theoretical and does not involve dataset splits for training, validation, or testing.
Hardware Specification | No | This paper is theoretical and does not report experiments requiring hardware specifications.
Software Dependencies | No | This paper is theoretical and does not mention specific software dependencies with version numbers.
Experiment Setup | No | This paper is theoretical and does not include details about an experimental setup or hyperparameters.
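For readers who want a concrete picture of the classical baseline named in the pseudocode row, below is a minimal Python sketch of the Online Newton Step (Hazan et al., 2007), the method that the paper's quantum quasi-Newton approach builds on. It is illustrative only: the gradient oracle grad_fn, the parameters gamma, eps, and radius, and the Euclidean-ball projection are assumptions made for this sketch, not details from the paper, and it does not implement the quantum subroutine of QONS.

```python
# Minimal sketch of the classical Online Newton Step (ONS) baseline
# (Hazan et al., 2007). Everything here is illustrative: the loss, the
# gradient oracle, and the parameters gamma/eps/radius are hypothetical
# placeholders, and the generalized (A_t-norm) projection of ONS is
# simplified to a plain Euclidean-ball projection.

import numpy as np

def online_newton_step(grad_fn, n, T, gamma=0.5, eps=1.0, radius=1.0):
    """Run T rounds of ONS in dimension n.

    grad_fn(t, x) returns a (sub)gradient of the round-t loss at x;
    in QONS this oracle would be replaced by a quantum gradient estimate.
    """
    x = np.zeros(n)             # current decision
    A = eps * np.eye(n)         # running Hessian approximation
    decisions = []
    for t in range(T):
        decisions.append(x.copy())
        g = grad_fn(t, x)                               # (possibly inexact) gradient
        A += np.outer(g, g)                             # rank-one Hessian update
        x = x - (1.0 / gamma) * np.linalg.solve(A, g)   # Newton-style step
        # Simplified projection back onto a Euclidean ball of the given radius;
        # ONS proper projects in the norm induced by A.
        norm = np.linalg.norm(x)
        if norm > radius:
            x *= radius / norm
    return decisions

if __name__ == "__main__":
    # Toy quadratic losses f_t(x) = ||x - c_t||^2 with random centers c_t.
    rng = np.random.default_rng(0)
    centers = 0.1 * rng.normal(size=(100, 5))
    grads = lambda t, x: 2.0 * (x - centers[t])
    xs = online_newton_step(grads, n=5, T=100)
    print("final decision:", xs[-1])
```

Per the abstract quoted above, QONS replaces the exact gradient oracle with a quantum-estimated inexact gradient obtained with O(1) queries per round, which is what yields the O(n log T) regret and the T^{2/3} improvement over the optimal classical algorithm.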