Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Quantum Algorithms and Lower Bounds for Finite-Sum Optimization
Authors: Yexin Zhang, Chenyi Zhang, Cong Fang, Liwei Wang, Tongyang Li
ICML 2024 | Venue PDF | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We give a quantum algorithm with complexity O(n + √(ℓ/µ)·(n^{1/3}d^{1/3} + n^{-2/3}d^{5/6})), improving the classical tight bound Θ(n + √(nℓ/µ)). We also prove a quantum lower bound Ω(n + n^{3/4}(ℓ/µ)^{1/4}) when d is large enough. |
| Researcher Affiliation | Academia | ¹School of Electronics Engineering and Computer Science, Peking University, China; ²Computer Science Department, Stanford University, USA; ³National Key Lab of General Artificial Intelligence, School of Intelligence Science and Technology, Peking University; ⁴Institute for Artificial Intelligence, Peking University; ⁵Center on Frontiers of Computing Studies, Peking University, China; ⁶School of Computer Science, Peking University, China. |
| Pseudocode | Yes | Algorithm 1: Q-Katyusha |
| Open Source Code | No | The paper does not contain any explicit statements about releasing source code or links to a code repository. |
| Open Datasets | No | The paper is theoretical and does not conduct experiments with datasets. |
| Dataset Splits | No | The paper is theoretical and does not conduct experiments involving dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not list any specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details such as hyperparameters or training configurations. |
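The complexity bounds quoted in the Research Type row can be compared numerically. The sketch below is illustrative only: it drops logarithmic factors and constants, assumes the reconstructed form of the quantum upper bound O(n + √(ℓ/µ)·(n^{1/3}d^{1/3} + n^{-2/3}d^{5/6})), and uses `kappa` as shorthand for the condition number ℓ/µ; the function names are hypothetical, not from the paper.

```python
import math

def classical_bound(n, kappa):
    # Classical tight bound Theta(n + sqrt(n * l/mu)), log factors dropped.
    return n + math.sqrt(n * kappa)

def quantum_bound(n, d, kappa):
    # Quantum upper bound as reconstructed from the quoted abstract,
    # O(n + sqrt(l/mu) * (n^{1/3} d^{1/3} + n^{-2/3} d^{5/6})), log factors dropped.
    return n + math.sqrt(kappa) * (n ** (1 / 3) * d ** (1 / 3)
                                   + n ** (-2 / 3) * d ** (5 / 6))

# Illustrative regime: many summands, moderate dimension, large condition number.
n, d, kappa = 10**6, 10**2, 10**8
print(quantum_bound(n, d, kappa) < classical_bound(n, kappa))  # quantum bound is smaller here
```

For this choice of parameters the quantum expression evaluates to roughly 5.6 × 10⁶ versus roughly 1.1 × 10⁷ classically, consistent with the paper's claim of an improvement when the dimension d is not too large relative to n.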