Evaluating the Implicit Midpoint Integrator for Riemannian Hamiltonian Monte Carlo
Authors: James A. Brofos, Roy R. Lederman
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirically, we find that while leapfrog iterations are faster, the implicit midpoint integrator has better energy conservation, leading to higher acceptance rates, as well as better conservation of volume and better reversibility, arguably yielding a more accurate sampling procedure. (Abstract) and Section 4, Experimental Results |
| Researcher Affiliation | Academia | Department of Statistics and Data Science, Yale University. Correspondence to: James A. Brofos <james.brofos@yale.edu> |
| Pseudocode | Yes | Algorithm 1 (Fixed Point Iteration) Procedure for solving the equation z = f(z) via fixed point iteration to a given tolerance. and Algorithm 2 (G.L.F.(a)) The procedure for a single step of integrating Hamiltonian dynamics using the generalized leapfrog integrator. and Algorithm 3 (I.M.(a)) The procedure for a single step of integrating Hamiltonian dynamics using the implicit midpoint integrator. A hedged code sketch of the fixed point iteration and a single implicit midpoint step appears after the table. |
| Open Source Code | Yes | Code for our experiments can be found at https://github.com/JamesBrofos/Evaluating-the-Implicit-Midpoint-Integrator. |
| Open Datasets | No | The paper uses distributions like the ‘banana-shaped distribution’ and ‘Neal’s funnel distribution’ and generates synthetic observations or defines models, but does not provide concrete access (link, citation, repository) to a pre-existing publicly available dataset. |
| Dataset Splits | No | The paper describes experimental setups but does not explicitly state specific training, validation, and test dataset splits (e.g., percentages, counts, or predefined citations) required for reproduction. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory amounts, or detailed computer specifications) used for running its experiments. |
| Software Dependencies | No | We implemented all methods in 64-bit precision using NumPy and SciPy (Harris et al., 2020; Virtanen et al., 2020). This mentions the software but lacks the specific version numbers needed for reproducibility; a version-recording snippet is sketched after the table. |
| Experiment Setup | Yes | We consider two step-sizes {0.01, 0.1} and a number of integration steps in {5, 10, 50}. We attempt to draw 10,000 samples from the posterior. Each of these configurations is replicated ten times. and To define a stopping condition for the fixed point iterations used by the implicit midpoint and generalized leapfrog methods, we demand that the change in each coordinate be less than a threshold; we let δ ∈ {1×10⁻⁹, 1×10⁻⁶, 1×10⁻³} when considering reversibility and volume preservation. When reporting performance metrics such as effective sample size, we report results corresponding to a threshold of δ = 1×10⁻⁶. A sketch of this experimental grid in code appears after the table. |
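As a companion to the pseudocode row above (Algorithms 1 and 3), the following is a minimal NumPy sketch, not the authors' implementation, of solving z = f(z) by fixed point iteration with the coordinate-wise stopping threshold δ described in the setup, and of a single implicit midpoint step built on that solver. The function names, the `ham_vector_field` argument, and the default tolerance are illustrative assumptions.

```python
import numpy as np


def fixed_point(f, z0, delta=1e-6, max_iters=1000):
    """Iterate z <- f(z) until every coordinate changes by less than delta."""
    z = np.asarray(z0, dtype=np.float64)
    for _ in range(max_iters):
        z_new = f(z)
        if np.max(np.abs(z_new - z)) < delta:
            return z_new
        z = z_new
    return z  # return the last iterate if the tolerance was not reached


def implicit_midpoint_step(ham_vector_field, z, step_size, delta=1e-6):
    """One implicit midpoint step: z' = z + h * F((z + z') / 2).

    Writing m = (z + z') / 2, the update reduces to the fixed point problem
    m = z + (h / 2) * F(m), after which z' = 2 * m - z.
    """
    m = fixed_point(lambda w: z + 0.5 * step_size * ham_vector_field(w),
                    z, delta=delta)
    return 2.0 * m - z
```

In the Riemannian setting of the paper, `ham_vector_field` would evaluate the Hamiltonian vector field of the non-separable Hamiltonian at the phase-space point z = (q, p); the sketch above only shows how the fixed point tolerance and the midpoint update fit together.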
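Since the software-dependency row notes that NumPy and SciPy are cited without version numbers, below is a small sketch of how a run could record the exact installed versions; the output file name is an arbitrary choice, not something from the paper.

```python
# Record the exact NumPy and SciPy versions used for a run so the
# environment can be reconstructed later; the file name is arbitrary.
import numpy
import scipy

with open("environment_versions.txt", "w") as fh:
    fh.write(f"numpy=={numpy.__version__}\n")
    fh.write(f"scipy=={scipy.__version__}\n")
```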
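The experiment-setup row describes a grid of step sizes, integration steps, convergence thresholds, and replications. The loop below is a hedged sketch of enumerating that grid; `run_sampler` is a hypothetical stand-in for drawing the 10,000 posterior samples, and crossing every threshold with every configuration is an illustrative simplification, since the paper pairs the {1e-9, 1e-6, 1e-3} thresholds specifically with the reversibility and volume-preservation checks.

```python
import itertools

step_sizes = [0.01, 0.1]          # integrator step sizes considered
num_steps_options = [5, 10, 50]   # integration steps per proposal
thresholds = [1e-9, 1e-6, 1e-3]   # fixed point convergence thresholds (delta)
num_samples = 10_000              # posterior samples attempted per run
num_replications = 10             # independent replications per configuration

for step_size, num_steps, delta in itertools.product(
        step_sizes, num_steps_options, thresholds):
    for replication in range(num_replications):
        # run_sampler is a hypothetical entry point, not the authors' API.
        # run_sampler(step_size, num_steps, delta, num_samples, seed=replication)
        pass
```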