Fast Convergence of Langevin Dynamics on Manifold: Geodesics meet Log-Sobolev

Authors: Xiao Wang, Qi Lei, Ioannis Panageas

NeurIPS 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | Our work generalizes the results of Vempala and Wibisono [2019] to the case where f is defined on a manifold M rather than R^n. From a technical point of view, we show that the KL divergence decreases at a geometric rate whenever the distribution e^{-f} satisfies a log-Sobolev inequality on M. Our main technical contribution is a non-asymptotic convergence guarantee for the Geodesic Langevin Algorithm on a closed manifold, obtained with the help of the log-Sobolev inequality.
Researcher Affiliation | Academia | Xiao Wang (SUTD) xiao_wang@sutd.edu.sg; Qi Lei (Princeton) qilei@princeton.edu; Ioannis Panageas (UC Irvine) ipanagea@uci.edu
Pseudocode | No | The paper describes the steps of the Geodesic Langevin Algorithm, but it does not present them in a formally structured pseudocode block or a labeled 'Algorithm' environment; a hedged sketch of a generic geodesic Langevin update is given after this table.
Open Source Code | No | The paper does not contain any explicit statements or links indicating that open-source code for the described methodology is provided.
Open Datasets | No | The paper is theoretical and does not conduct experiments on datasets, so no information on publicly available training datasets is provided.
Dataset Splits | No | The paper is theoretical and does not conduct experiments that would involve training/validation/test dataset splits.
Hardware Specification | No | The paper does not mention any specific hardware used for experiments, as it is a theoretical work.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers, as it is a theoretical work.
Experiment Setup | No | The paper is theoretical and does not provide specific experimental setup details such as hyperparameters or training configurations for empirical evaluation.
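
To make the object being analyzed concrete, the following is a minimal sketch, not the paper's own pseudocode. It assumes the commonly used geodesic Langevin update x_{k+1} = Exp_{x_k}(-eta * grad f(x_k) + sqrt(2*eta) * xi_k), with Gaussian noise xi_k drawn in the tangent space, instantiated on the unit sphere (a closed manifold with a closed-form exponential map). All names (exp_map_sphere, geodesic_langevin_step, grad_f, eta) and the example target f(x) = <a, x> are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def exp_map_sphere(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x
    in tangent direction v for unit time."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def project_to_tangent(x, g):
    """Project an ambient-space vector g onto the tangent space at x (|x| = 1)."""
    return g - np.dot(g, x) * x

def geodesic_langevin_step(x, grad_f, eta, rng):
    """One geodesic Langevin update: move along the geodesic in the direction
    of the Riemannian negative gradient plus tangent-space Gaussian noise."""
    g = project_to_tangent(x, grad_f(x))                      # Riemannian gradient
    xi = project_to_tangent(x, rng.standard_normal(x.shape))  # tangent noise
    v = -eta * g + np.sqrt(2.0 * eta) * xi                    # drift + diffusion
    return exp_map_sphere(x, v)

# Illustrative usage (assumed example): sample approximately from the
# distribution proportional to e^{-f} on the 2-sphere with f(x) = <a, x>.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = np.array([3.0, 0.0, 0.0])
    grad_f = lambda x: a          # ambient gradient of f(x) = <a, x>
    x = np.array([0.0, 0.0, 1.0])
    for _ in range(2000):
        x = geodesic_langevin_step(x, grad_f, eta=0.01, rng=rng)
    print("final sample:", x, "norm:", np.linalg.norm(x))
```

The sketch only illustrates the mechanics of one update on one specific manifold. The paper's result concerns the analysis of such a scheme on a closed manifold M: the intended target is the distribution proportional to e^{-f}, and the log-Sobolev inequality is what yields the non-asymptotic, geometric decay of KL divergence described in the Research Type row above.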