Efficient Sampling on Riemannian Manifolds via Langevin MCMC

Authors: Xiang Cheng, Jingzhao Zhang, Suvrit Sra

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We study the task of efficiently sampling from a Gibbs distribution dπ = e^{-h} dvol_g over a Riemannian manifold M via (geometric) Langevin MCMC; this algorithm involves computing exponential maps in random Gaussian directions and is efficiently implementable in practice. The key to our analysis of Langevin MCMC is a bound on the discretization error of the geometric Euler-Murayama scheme, assuming h is Lipschitz and M has bounded sectional curvature. Our error bound matches the error of Euclidean Euler-Murayama in terms of its stepsize dependence. Combined with a contraction guarantee for the geometric Langevin Diffusion under Kendall-Cranston coupling, we prove that the Langevin MCMC iterates lie within ε-Wasserstein distance of π after O(ε^{-2}) steps, which matches the iteration complexity for Euclidean Langevin MCMC. (An illustrative sketch of this update rule appears after the table.)
Researcher Affiliation | Academia | Xiang Cheng (Massachusetts Institute of Technology, x.cheng@berkeley.edu); Jingzhao Zhang (Tsinghua University, jzhzhang@mit.edu); Suvrit Sra (Massachusetts Institute of Technology, suvrit@mit.edu)
Pseudocode | No | The paper describes mathematical models and processes but does not include any pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any explicit statement about open-source code release or a link to a code repository.
Open Datasets | No | This is a theoretical paper and does not involve experimental evaluation on datasets. Therefore, no information about publicly available datasets is provided.
Dataset Splits | No | This is a theoretical paper and does not involve experimental evaluation on datasets, so there are no training/validation/test splits mentioned.
Hardware Specification | No | The paper is theoretical and does not describe computational experiments, so no hardware specifications are mentioned.
Software Dependencies | No | The paper is theoretical and does not specify software dependencies with version numbers for experimental reproducibility.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup, hyperparameters, or system-level training settings.
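For intuition about the sampler analyzed in the abstract, the following is a minimal sketch of a geometric Langevin MCMC iteration: take a step along the exponential map in the direction of the negative Riemannian gradient of h plus Gaussian noise in the tangent space. This is not the authors' code (the paper provides no pseudocode); the choice of manifold (the unit sphere, where the exponential map is closed-form), the target h, the step size, and all function names are illustrative assumptions.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x
    in the tangent direction v for unit time."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def tangent_project(x, g):
    """Project a Euclidean vector g onto the tangent space at x (unit x)."""
    return g - np.dot(g, x) * x

def geometric_langevin_mcmc(grad_h, x0, step, n_iters, rng):
    """Discretized Riemannian Langevin step (illustrative):
    x_{k+1} = Exp_{x_k}( -step * grad h(x_k) + sqrt(2*step) * xi ),
    where xi is a standard Gaussian in the tangent space at x_k."""
    x = x0 / np.linalg.norm(x0)
    samples = []
    for _ in range(n_iters):
        xi = tangent_project(x, rng.standard_normal(x.shape))  # tangent Gaussian
        v = -step * tangent_project(x, grad_h(x)) + np.sqrt(2.0 * step) * xi
        x = sphere_exp(x, v)
        samples.append(x)
    return np.array(samples)

if __name__ == "__main__":
    # Hypothetical target: h(x) = -<mu, x>, so e^{-h} concentrates around mu/|mu|.
    d = 3
    mu = np.zeros(d); mu[0] = 5.0
    grad_h = lambda x: -mu  # Euclidean gradient of h; projected inside the sampler
    rng = np.random.default_rng(0)
    samples = geometric_langevin_mcmc(grad_h, np.ones(d), step=0.01,
                                      n_iters=5000, rng=rng)
    print("mean of later samples:", samples[2000:].mean(axis=0))
```

Under these assumptions, the iterates stay exactly on the manifold because every move is made through the exponential map, which is the property the paper's discretization analysis relies on; on manifolds without a closed-form exponential map, a numerical geodesic solver or retraction would be substituted.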