Reflection, Refraction, and Hamiltonian Monte Carlo
Authors: Hadi Mohasel Afshar, Justin Domke
NeurIPS 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments show that by reducing the number of rejected samples, this method improves on traditional HMC. |
| Researcher Affiliation | Academia | Hadi Mohasel Afshar, Research School of Computer Science, Australian National University, Canberra, ACT 0200, hadi.afshar@anu.edu.au; Justin Domke, National ICT Australia (NICTA) & Australian National University, Canberra, ACT 0200, Justin.Domke@nicta.com.au |
| Pseudocode | Yes | Algorithm 1: BASELINE & REFLECTIVE HMC ALGORITHMS (the reflection/refraction momentum update it relies on is sketched below the table) |
| Open Source Code | No | The paper does not provide any explicit statements about the availability of open-source code or links to a code repository. |
| Open Datasets | No | The comparison takes place on a heavy-tail piecewise model with (non-normalized) negative log probability ... (their Eq. 18). This implies a synthetic model/distribution is used, not an existing publicly available dataset. |
| Dataset Splits | No | The paper does not specify exact percentages or sample counts for training, validation, or test splits. It describes running Markov chains and evaluating WMAE. |
| Hardware Specification | No | All algorithms are implemented in java and run on a single thread of a 3.40GHz CPU. This CPU description is too general and does not provide a specific model number or full hardware specification. |
| Software Dependencies | No | All algorithms are implemented in java. No specific version of Java or any other software dependencies with version numbers are mentioned. |
| Experiment Setup | Yes | The baseline HMC and RHMC number of steps L and step size ϵ are chosen to be 100 and 0.1 respectively. ... We use a diagonal matrix for A where, for each repetition, each entry on the main diagonal is either exp(−5) or exp(5) with equal probabilities. (A minimal sketch of this baseline setup follows the table.) |
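Since the Pseudocode row only names Algorithm 1 without reproducing it, the following is a minimal sketch of the momentum update at a potential-energy discontinuity that reflective/refractive HMC is built around. The function name, the arguments, and the omission of boundary-crossing detection inside the leapfrog step are assumptions of this sketch, not the authors' Java implementation.

```python
import numpy as np

def reflect_or_refract(p, n_hat, delta_U):
    """Momentum update at a boundary where the potential energy jumps by
    delta_U (taken positive when moving into the higher-energy region).
    p is the current momentum, n_hat a unit normal of the boundary."""
    p_perp = p @ n_hat                 # component perpendicular to the boundary
    p_par = p - p_perp * n_hat         # component parallel to the boundary
    if p_perp ** 2 > 2.0 * delta_U:
        # Enough perpendicular kinetic energy: refract (cross with reduced speed).
        p_perp_new = np.sign(p_perp) * np.sqrt(p_perp ** 2 - 2.0 * delta_U)
    else:
        # Not enough energy to climb the jump: reflect off the boundary.
        p_perp_new = -p_perp
    return p_par + p_perp_new * n_hat
```

Locating where each leapfrog step crosses a boundary of the piecewise density, which the paper's Algorithm 1 handles, is deliberately left out of this sketch.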
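The Experiment Setup row can also be read as a concrete configuration. Below is a hedged sketch of baseline HMC with the reported step size ϵ = 0.1 and L = 100 leapfrog steps, together with the random diagonal matrix A described in the table. The target density in the demo is a simple stand-in (a standard normal), not the paper's heavy-tail piecewise model of Eq. 18, and the dimensionality and function names are illustrative assumptions.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, L):
    """Standard leapfrog integrator used by the baseline sampler."""
    q, p = q.copy(), p.copy()
    p = p - 0.5 * eps * grad_U(q)
    for i in range(L):
        q = q + eps * p
        if i < L - 1:
            p = p - eps * grad_U(q)
    p = p - 0.5 * eps * grad_U(q)
    return q, -p  # momentum flip keeps the proposal reversible

def hmc(q0, U, grad_U, n_samples, eps=0.1, L=100, seed=0):
    """Baseline HMC with the step size and number of steps reported above."""
    rng = np.random.default_rng(seed)
    q = np.asarray(q0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(q.shape)
        q_new, p_new = leapfrog(q, p, grad_U, eps, L)
        # Metropolis correction on H(q, p) = U(q) + |p|^2 / 2
        dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
        if rng.random() < np.exp(-dH):
            q = q_new
        samples.append(q.copy())
    return np.asarray(samples)

# Diagonal matrix A as described in the setup row: each diagonal entry is
# exp(-5) or exp(5) with equal probability (one random draw shown; the
# dimensionality is illustrative, not taken from the table). In the paper,
# A parameterizes the Eq. 18 target, which is not reproduced here.
rng = np.random.default_rng(0)
dim = 10
A = np.diag(np.exp(rng.choice([-5.0, 5.0], size=dim)))

# Stand-in target so the demo runs end to end: a standard normal.
U = lambda q: 0.5 * q @ q
grad_U = lambda q: q
draws = hmc(np.zeros(dim), U, grad_U, n_samples=500)
```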