Unbiased constrained sampling with Self-Concordant Barrier Hamiltonian Monte Carlo
Authors: Maxence Noble, Valentin De Bortoli, Alain Durmus
NeurIPS 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In our experiments, we illustrate the performance of n-BHMC (Algorithm 1) to sample from target distributions which are supported on polytopes. We compare our method with the numerical implementation of CRHMC provided by Kook et al. (2022a). In all of our settings, we compute g as the Hessian of the logarithmic barrier, see Section 2.2. The algorithms are always initialized at the center of mass of the considered polytope. At each iteration of n-BHMC, we perform one step of numerical integration, using the Störmer-Verlet scheme with K = 30 fixed-point steps and keep the refresh parameter β equal to 1. We refer to Appendix K for more details on the setting of our experiments and additional results. (Illustrative sketches of the barrier metric and the integrator step appear below the table.) |
| Researcher Affiliation | Academia | Maxence Noble (CMAP, CNRS, École polytechnique, Institut Polytechnique de Paris, 91120 Palaiseau, France); Valentin De Bortoli (Computer Science Department, ENS, CNRS, PSL University); Alain Oliviero Durmus (CMAP, CNRS, École polytechnique, Institut Polytechnique de Paris, 91120 Palaiseau, France) |
| Pseudocode | Yes | Algorithm 1: n-BHMC with Momentum Refresh. Input: (X_0, P_0) ∈ TM, β ∈ (0, 1], N ∈ ℕ, h > 0, η > 0, Φ_h with domain dom Φ_h. Output: (X_n, P_n)_{n ∈ [N]} |
| Open Source Code | Yes | Our code: https://github.com/maxencenoble/barrier-hamiltonian-monte-carlo. |
| Open Datasets | Yes | We then consider 10 polytopes given in the COBRA Toolbox v3.0 (Heirendt et al., 2019), which model molecular systems, and follow the method provided in (Kook et al., 2022a, Appendix A) to pre-process them. |
| Dataset Splits | No | The paper describes sampling from distributions and running MCMC algorithms for a number of iterations, but it does not specify training, validation, or test dataset splits in the conventional sense of supervised learning. |
| Hardware Specification | No | The paper does not provide any specific details regarding the hardware specifications (e.g., GPU/CPU models, memory) used to run the experiments. |
| Software Dependencies | No | The paper mentions 'MATLAB implementation of CRHMC' but does not specify the version of MATLAB or any other software dependencies with version numbers. |
| Experiment Setup | Yes | At each iteration of n-BHMC, we perform one step of numerical integration, using the Störmer-Verlet scheme with K = 30 fixed-point steps and keep the refresh parameter β equal to 1. ... We recall that we use an adaptive step-size h in CRHMC and n-BHMC such that we obtain an average acceptance probability of order 0.5 in the MH filter. ... This heuristic results in defining (a) for the hypercube, η = 5 if d = 5 and η = 10 if d = 10, and (b) for the simplex, η = 10 if d = 5 and η = 200 if d = 10. (A hedged sketch of one possible step-size adaptation rule appears below the table.) |
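The metric g used throughout the experiments is the Hessian of the logarithmic barrier of the polytope (Section 2.2 of the paper). The sketch below is a minimal NumPy illustration of that Hessian for a polytope written as {x : Ax < b}; the function name `log_barrier` and the hypercube example are our own illustration, not taken from the authors' repository.

```python
import numpy as np

def log_barrier(A, b, x):
    """Value, gradient and Hessian of the logarithmic barrier
    phi(x) = -sum_i log(b_i - a_i^T x) on the open polytope {x : Ax < b}."""
    s = b - A @ x                        # slack variables, must stay positive
    if np.any(s <= 0):
        raise ValueError("x lies outside the open polytope {x : Ax < b}")
    value = -np.sum(np.log(s))
    grad = A.T @ (1.0 / s)               # A^T diag(s)^{-1} 1
    hess = A.T @ (A / s[:, None] ** 2)   # A^T diag(s)^{-2} A
    return value, grad, hess

# Example: metric g(x) at the center of the unit hypercube [0, 1]^d,
# written as {x : x <= 1, -x <= 0}.
d = 5
A = np.vstack([np.eye(d), -np.eye(d)])
b = np.concatenate([np.ones(d), np.zeros(d)])
_, _, g = log_barrier(A, b, 0.5 * np.ones(d))
```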
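The experiments take a single Störmer-Verlet (generalized leapfrog) integration step per iteration, resolving its implicit equations with K = 30 fixed-point steps. The sketch below shows the usual structure of one such step for a non-separable Hamiltonian H(x, p); the function names and the initialization of the fixed-point iterations are our assumptions, and the authors' implementation may differ in these details.

```python
import numpy as np

def stormer_verlet_step(x, p, h, grad_x_H, grad_p_H, K=30):
    """One generalized Stormer-Verlet (implicit leapfrog) step for a
    non-separable Hamiltonian H(x, p). The two implicit equations are
    solved with K fixed-point iterations (K = 30 in the paper's experiments).
    grad_x_H(x, p) and grad_p_H(x, p) return the partial gradients of H."""
    # Implicit momentum half-step: p_half = p - (h/2) * dH/dx(x, p_half)
    p_half = np.copy(p)
    for _ in range(K):
        p_half = p - 0.5 * h * grad_x_H(x, p_half)

    # Implicit position step:
    # x_new = x + (h/2) * (dH/dp(x, p_half) + dH/dp(x_new, p_half))
    x_new = np.copy(x)
    for _ in range(K):
        x_new = x + 0.5 * h * (grad_p_H(x, p_half) + grad_p_H(x_new, p_half))

    # Explicit momentum half-step
    p_new = p_half - 0.5 * h * grad_x_H(x_new, p_half)
    return x_new, p_new
```

For the Riemannian Hamiltonians used in this class of methods, typically H(x, p) = f(x) + ½ log det g(x) + ½ pᵀ g(x)⁻¹ p up to constants, with g the barrier Hessian above, both partial gradients depend on x through g, which is why the scheme is implicit and requires the fixed-point iterations.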
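The step size h is tuned adaptively so that the average Metropolis-Hastings acceptance probability is around 0.5. The quoted text does not spell out the adaptation rule, so the following Robbins-Monro-style update is only one plausible heuristic consistent with that description; the rule and its parameters are our assumption.

```python
import numpy as np

def adapt_step_size(h, accept_prob, target=0.5, rate=0.05):
    """Nudge the step size h so that the observed acceptance probability
    drifts toward `target` (about 0.5 here). Illustrative heuristic only;
    the paper merely states the 0.5 target."""
    return h * np.exp(rate * (accept_prob - target))

# Typical use inside the sampling loop:
#   h = adapt_step_size(h, float(accepted_this_iteration))
```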