Optimal Underdamped Langevin MCMC Method

Authors: Zhengmian Hu, Feihu Huang, Heng Huang

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimental results on both synthetic and real-world data show that our new method consistently outperforms the existing ULD approaches."
Researcher Affiliation | Academia | Zhengmian Hu, Feihu Huang, Heng Huang; Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15213, USA
Pseudocode | Yes | Algorithm 1: Full gradient ALUM Method
Open Source Code | No | The paper does not provide an explicit statement about releasing source code for the methodology, nor a link to a code repository.
Open Datasets | Yes | Data points in the australian dataset from LIBSVM [32].
Dataset Splits | No | The paper uses the australian dataset from LIBSVM but does not specify exact training, validation, or test splits (e.g., percentages or sample counts) or a cross-validation setup.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory specifications) used for running the experiments.
Software Dependencies | No | The paper does not list software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions) used in the implementation.
Experiment Setup | Yes | For the Gaussian model, the potential is f_i(x) = (1/(2N)) (d_i − x)^T Σ^{-1} (d_i − x), where the d_i and Σ are generated randomly so that d = 5, N = 100, m = 1, and L = 10. For the logistic regression model, the potential is f(x) = Σ_i log(1 + exp(−y_i a_i^T x)), where m is set such that κ = 10^4 and the (y_i, a_i) are data points in the australian dataset from LIBSVM [32]. How the reference path is generated is specified in Appendix D.1; the detailed setup can be found in Appendix D.2.
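Since no code is released, the Gaussian setting above can only be illustrated by a hypothetical sketch. The snippet below recreates the potential f_i(x) = (1/(2N)) (d_i − x)^T Σ^{-1} (d_i − x) with d = 5 and N = 100, and runs a textbook Euler–Maruyama discretization of underdamped Langevin dynamics; it is NOT the authors' higher-order ALUM method. The random SPD matrix `Sigma`, step size `h`, and friction `gamma` are illustrative assumptions (the paper additionally tunes Σ so that m = 1 and L = 10).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical recreation of the paper's Gaussian setting (d = 5, N = 100).
# Sigma is just a random SPD matrix here, not tuned to m = 1, L = 10.
d, N = 5, 100
D = rng.normal(size=(N, d))            # the data points d_i
A = rng.normal(size=(d, d))
Sigma = A @ A.T + np.eye(d)            # random symmetric positive-definite matrix
Sigma_inv = np.linalg.inv(Sigma)

def f_gaussian(x):
    # f(x) = sum_i (1/(2N)) (d_i - x)^T Sigma^{-1} (d_i - x)
    diff = D - x                       # shape (N, d)
    return 0.5 / N * np.einsum("ni,ij,nj->", diff, Sigma_inv, diff)

def grad_f_gaussian(x):
    # grad f(x) = Sigma^{-1} (x - mean_i d_i)
    return Sigma_inv @ (x - D.mean(axis=0))

def uld_euler_step(x, v, grad_f, h=0.01, gamma=2.0):
    # One Euler-Maruyama step of underdamped Langevin dynamics:
    #   dv = -gamma * v dt - grad f(x) dt + sqrt(2 * gamma) dW,   dx = v dt
    # (a basic first-order discretization, not the ALUM scheme)
    v_new = v - h * (gamma * v + grad_f(x)) \
            + np.sqrt(2.0 * gamma * h) * rng.normal(size=x.shape)
    x_new = x + h * v
    return x_new, v_new

# Run a short chain; iterates drift toward the minimizer mean_i d_i.
x, v = np.zeros(d), np.zeros(d)
for _ in range(2000):
    x, v = uld_euler_step(x, v, grad_f_gaussian)
```

The quadratic potential makes the gradient available in closed form, so this toy chain is cheap to run; the paper's actual comparison instead measures each method against a finely discretized reference path (Appendix D.1).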