A No-go Theorem for Robust Acceleration in the Hyperbolic Plane

Authors: Linus Hamilton, Ankur Moitra

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | From the abstract: "Here we prove that in a noisy setting, there is no analogue of accelerated gradient descent for geodesically convex functions on the hyperbolic plane. Our results apply even when the noise is exponentially small. The key intuition behind our proof is short and simple: In negatively curved spaces, the volume of a ball grows so fast that information about the past gradients is not useful in the future."
Researcher Affiliation | Academia | Department of Mathematics, Massachusetts Institute of Technology. Email: luh@mit.edu. This work was supported in part by a Fannie and John Hertz Foundation Fellowship. Department of Mathematics, Massachusetts Institute of Technology. Email: moitra@mit.edu.
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | No | The paper states "N/A" for the checklist questions about including code and data, and no explicit statement or link for a code release is found.
Open Datasets | No | The paper is theoretical and does not conduct experiments involving datasets; hence, there is no mention of dataset availability for training.
Dataset Splits | No | The paper is theoretical and does not conduct experiments involving data splits.
Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for experiments.
Software Dependencies | No | The paper is theoretical and does not specify software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with hyperparameters or training settings.
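The volume-growth intuition quoted in the Research Type row can be checked numerically. The sketch below, a hypothetical illustration not taken from the paper, compares the area of a geodesic disk in the hyperbolic plane (constant curvature -1, where the standard formula is 2*pi*(cosh(r) - 1)) against a Euclidean disk of the same radius; the function names are illustrative.

```python
import math

def euclidean_disk_area(r: float) -> float:
    """Area of a radius-r disk in the flat plane: pi * r^2."""
    return math.pi * r * r

def hyperbolic_disk_area(r: float) -> float:
    """Area of a radius-r geodesic disk in the hyperbolic plane of
    curvature -1: 2*pi*(cosh(r) - 1), which grows like e^r."""
    return 2 * math.pi * (math.cosh(r) - 1)

# The ratio blows up exponentially in r, so a ball in the hyperbolic
# plane quickly contains vastly more "room" than its Euclidean twin.
for r in (1, 5, 10, 20):
    ratio = hyperbolic_disk_area(r) / euclidean_disk_area(r)
    print(f"r={r:2d}  hyperbolic/euclidean area ratio = {ratio:.3e}")
```

This exponential gap is the mechanism behind the quoted intuition: past gradient information localizes the optimum to a ball, but in negative curvature that ball holds exponentially many far-apart candidate regions, so the information loses its value.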