Sampling from Structured Log-Concave Distributions via a Soft-Threshold Dikin Walk

Authors: Oren Mangoubi, Nisheeth K. Vishnoi

NeurIPS 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We present a generalization of the Dikin walk to this setting that requires at most O((md + dL^2R^2) md^{ω−1} log(w/δ)) arithmetic operations to sample from π within error δ > 0 in the total variation distance from a w-warm start.
Researcher Affiliation | Academia | Oren Mangoubi (Worcester Polytechnic Institute); Nisheeth K. Vishnoi (Yale University)
Pseudocode | Yes | Algorithm 1: Soft-threshold Dikin walk (an illustrative sketch of this style of walk follows the table).
Open Source Code | No | The paper discusses related work and applications that might involve open-source code, but it does not provide a statement about, or a link to, source code for the algorithm it presents.
Open Datasets | No | The paper is theoretical and focuses on algorithm design and runtime analysis. It mentions applications to Bayesian inference and differentially private optimization, which involve datasets, but it does not itself conduct experiments or use a specific dataset for training or evaluation.
Dataset Splits | No | The paper is theoretical and focuses on algorithm design and runtime analysis; it does not describe experiments that would involve training, validation, or test splits.
Hardware Specification | No | The paper is theoretical and focuses on algorithm design and runtime analysis; it does not describe any hardware used to run experiments.
Software Dependencies | No | The paper is theoretical and focuses on algorithm design and runtime analysis; it does not specify any software dependencies or version numbers.
Experiment Setup | No | The paper specifies hyperparameters (α, η, T) for its algorithm as part of the theoretical analysis needed to achieve the stated bounds, but these are not presented as settings for an actual experimental setup.
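
To make the Pseudocode row concrete, here is a minimal Python/NumPy sketch of one step of a regularized ("soft-threshold") Dikin-style walk with a Metropolis correction. It is an illustration under stated assumptions, not the paper's Algorithm 1: the polytope representation K = {x : Ax ≤ b}, the specific regularized metric ηH(x) + αI, the way the step size is folded into α and η, and the function name soft_threshold_dikin_step are all choices made for this example.

```python
import numpy as np

def soft_threshold_dikin_step(x, A, b, f, alpha, eta, rng):
    """One illustrative step of a regularized ("soft-threshold") Dikin-style walk.

    Sketch only, not the paper's Algorithm 1. Targets pi(x) proportional to
    exp(-f(x)) on K = {x : A x <= b}; the local metric adds alpha * I to
    (eta times) the log-barrier Hessian, and a Metropolis filter corrects
    the Gaussian proposal.
    """
    d = x.shape[0]

    def local_metric(y):
        s = b - A @ y                       # constraint slacks (positive in the interior of K)
        H = (A / s[:, None] ** 2).T @ A     # Hessian of the log-barrier of K at y
        return eta * H + alpha * np.eye(d)  # regularized ("soft-threshold") metric

    def log_gaussian(y, mean, G):
        # Log density of N(mean, G^{-1}) at y, dropping the (2*pi)^{-d/2} factor,
        # which cancels in the acceptance ratio.
        diff = y - mean
        _, logdet = np.linalg.slogdet(G)
        return 0.5 * logdet - 0.5 * diff @ G @ diff

    G_x = local_metric(x)
    # Propose z ~ N(x, G_x^{-1}); any explicit step size is absorbed into alpha and eta here.
    chol = np.linalg.cholesky(np.linalg.inv(G_x))
    z = x + chol @ rng.standard_normal(d)

    if np.any(A @ z >= b):                  # proposals outside K have zero target density
        return x

    G_z = local_metric(z)
    log_accept = (f(x) - f(z)
                  + log_gaussian(x, z, G_z) - log_gaussian(z, x, G_x))
    return z if np.log(rng.uniform()) < min(0.0, log_accept) else x
```

In use, one would iterate this step from a warm start (e.g. call it in a loop with a fixed `np.random.default_rng` instance); the paper's analysis is what fixes α, η, and the number of steps T to reach total variation error δ from a w-warm start.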