The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization

Authors: Dmitry Kovalev, Alexander Gasnikov

NeurIPS 2022

Reproducibility Variable | Result | LLM Response

Research Type | Experimental | In this paper, we study the fundamental open question of finding the optimal high-order algorithm for solving smooth convex minimization problems. ... We fix this fundamental issue by providing the first algorithm with O(ε^(-2/(3p+1))) p-th order oracle complexity. ... In this paper we provide an experimental comparison of the proposed optimal high-order algorithm for solving smooth convex minimization problems (Algorithm 4) with the existing near-optimal high-order method of Gasnikov et al. (2019a), Bubeck et al. (2019), and Jiang et al. (2019) (Algorithm 2). In summary, the experiments show that the proposed optimal Algorithm 4 is indeed a practical algorithm which significantly outperforms the existing near-optimal Algorithm 2. The details can be found in Appendix F.

Researcher Affiliation | Academia | Dmitry Kovalev (KAUST, dakovalev1@gmail.com); Alexander Gasnikov (MIPT, IITP RAS, HSE, gasnikov@yandex.ru). Affiliations: King Abdullah University of Science and Technology, Thuwal, Saudi Arabia; Moscow Institute of Physics and Technology, Dolgoprudny, Russia; Institute for Information Transmission Problems RAS, Moscow, Russia; National Research University Higher School of Economics, Moscow, Russia.

Pseudocode | Yes | Algorithm 1 (A-HPE Framework); Algorithm 2 (Near-Optimal Tensor Method); Algorithm 3 (Tensor Extragradient Method); Algorithm 4 (Optimal Tensor Method).

Open Source Code | No | N/A.

Open Datasets | No | N/A. The paper mentions an experimental comparison in Section 4.5 and Appendix F, but no specific dataset names with access information (citation, link, or repository) are provided in the main text or ethics statement.

Dataset Splits | No | N/A. The paper mentions an experimental comparison in Section 4.5 and Appendix F, but no training, validation, or test splits are detailed in the main text or ethics statement.

Hardware Specification | No | The paper does not provide any hardware specifications for the experiments.

Software Dependencies | No | The paper does not provide software dependencies with version numbers for its experiments.

Experiment Setup | No | N/A. The paper mentions an experimental comparison in Section 4.5 and Appendix F, but no experimental setup details, such as hyperparameters or training configurations, are provided in the main text or ethics statement.
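The headline result quoted above is the O(ε^(-2/(3p+1))) p-th order oracle complexity. Since the paper released no code, the following is our own minimal sketch (not the authors' implementation) of how that exponent shrinks as the method order p grows, ignoring all problem-dependent constants:

```python
# Sketch only: evaluates the scaling O(eps^(-2/(3p+1))) from the abstract,
# with constants ignored. The function name is ours, not from the paper.

def oracle_calls(eps: float, p: int) -> float:
    """Order of p-th order oracle calls needed for accuracy eps."""
    return eps ** (-2.0 / (3 * p + 1))

for p in (1, 2, 3):
    print(f"p={p}: eps^(-2/{3 * p + 1}) at eps=1e-6 -> {oracle_calls(1e-6, p):.1f}")
```

For example, at ε = 1e-6 the first-order case (p = 1) scales like ε^(-1/2), while higher orders flatten the dependence on ε considerably.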