Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Dimension-free Complexity Bounds for High-order Nonconvex Finite-sum Optimization
Authors: Dongruo Zhou, Quanquan Gu
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | More specifically, we propose stochastic high-order algorithms with novel first-order and high-order derivative estimators, which can achieve dimension-free complexity bounds. With access to p-th order derivatives of the objective function, we prove that our algorithm finds ϵ-stationary points with O(n^{(2p−1)/(2p)}/ϵ^{(p+1)/p}) high-order oracle complexity. |
| Researcher Affiliation | Academia | 1Department of Computer Science, University of California, Los Angeles, CA 90095, USA. Correspondence to: Quanquan Gu <EMAIL>. |
| Pseudocode | Yes | Algorithm 1 High-order Meta Algorithm, Algorithm 2 OP-TE, Algorithm 3 TP-TE |
| Open Source Code | No | The paper contains no statement or link indicating that source code for the described methodology is publicly available. |
| Open Datasets | No | The paper presents theoretical work on complexity bounds for optimization problems. It does not mention specific datasets used for training or evaluation, nor does it provide any links or citations to publicly available datasets. |
| Dataset Splits | No | The paper focuses on theoretical analysis and does not describe empirical experiments involving dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper focuses on theoretical complexity analysis of algorithms. It does not mention any specific hardware used for running experiments. |
| Software Dependencies | No | The paper focuses on theoretical derivations and algorithms. It does not list any software dependencies with specific version numbers. |
| Experiment Setup | No | The paper describes theoretical algorithms and their complexity bounds, but it does not detail an experimental setup, hyperparameters, or training configurations. |