Semi-Separable Hamiltonian Monte Carlo for Inference in Bayesian Hierarchical Models
Authors: Yichuan Zhang, Charles Sutton
NeurIPS 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we compare the performance of SSHMC with the standard HMC and RMHMC within Gibbs [7] on four benchmark models. The performance is evaluated by the minimum Effective Sample Size (ESS) over all dimensions (see [6]). (An illustrative minimum-ESS sketch follows the table.) |
| Researcher Affiliation | Academia | Yichuan Zhang, School of Informatics, University of Edinburgh (Y.Zhang-60@sms.ed.ac.uk); Charles Sutton, School of Informatics, University of Edinburgh (c.sutton@inf.ed.ac.uk) |
| Pseudocode | Yes | Algorithm 1 SSHMC by ABLA (an illustrative sketch of the alternating block-wise leapfrog structure follows the table) |
| Open Source Code | No | The paper does not provide an explicit statement about the release of source code for the described methodology, nor does it include a link to a code repository. |
| Open Datasets | Yes | We use the Statlog (German credit) dataset from [1]. [1] K. Bache and M. Lichman. UCI Machine Learning Repository, 2013. URL http://archive.ics.uci.edu/ml. |
| Dataset Splits | No | The paper discusses tuning parameters like step size and number of leapfrog steps, but it does not specify explicit training, validation, or test dataset splits in terms of percentages or counts for reproducibility. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models, processor types, or memory used for running the experiments. It does not mention any specific computing environments beyond general descriptions. |
| Software Dependencies | No | The paper does not provide specific software dependency details, such as library names with version numbers (e.g., Python 3.x, PyTorch 1.x). |
| Experiment Setup | Yes | The step sizes of all methods are manually tuned so that the acceptance rate is around 70-85%. The number of leapfrog steps is tuned for each method using preliminary runs. We use 2 leapfrog steps for the low-level parameters and 1 leapfrog step for the hyperparameter in ABLA, and the same leapfrog step size for the two separable Hamiltonians. (A step-size tuning sketch follows the table.) |
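
The evaluation metric quoted in the Research Type row is the minimum Effective Sample Size (ESS) over all dimensions. The following is a minimal sketch of that computation, assuming the standard autocorrelation-sum ESS estimate truncated at the first non-positive lag; the exact conventions used in [6] may differ.

```python
import numpy as np

def ess_1d(x):
    """ESS of one chain dimension: N / (1 + 2 * sum of positive-lag autocorrelations)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    x = x - x.mean()
    f = np.fft.rfft(x, 2 * n)                    # zero-padded FFT of the centered chain
    acov = np.fft.irfft(f * np.conj(f))[:n] / n  # autocovariance estimate
    rho = acov / acov[0]                         # autocorrelation function
    s = 0.0
    for lag in range(1, n):
        if rho[lag] <= 0:                        # truncate at the first non-positive lag
            break
        s += rho[lag]
    return n / (1.0 + 2.0 * s)

def min_ess(samples):
    """samples: (num_samples, num_dims) array of MCMC draws; returns the minimum ESS."""
    samples = np.asarray(samples, dtype=float)
    return min(ess_1d(samples[:, d]) for d in range(samples.shape[1]))
```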
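
The Pseudocode row points to Algorithm 1, SSHMC by ABLA (alternating block-wise leapfrog). The sketch below only illustrates the block-wise alternation quoted in the Experiment Setup row (2 leapfrog steps for the low-level parameters, 1 for the hyperparameter), using unit mass matrices for simplicity; the actual algorithm integrates a semi-separable Hamiltonian with state-dependent kinetic energies, and `grad_logp_theta` / `grad_logp_phi` are assumed user-supplied gradients of the log joint density, not part of the paper.

```python
import numpy as np

def leapfrog(x, p, grad_logp, eps, n_steps):
    """Standard leapfrog with unit mass; grad_logp is the gradient of the log density."""
    p = p + 0.5 * eps * grad_logp(x)         # half momentum step
    for i in range(n_steps):
        x = x + eps * p                       # full position step
        if i < n_steps - 1:
            p = p + eps * grad_logp(x)        # full momentum step between position steps
    p = p + 0.5 * eps * grad_logp(x)          # final half momentum step
    return x, p

def abla_like_proposal(theta, phi, grad_logp_theta, grad_logp_phi, eps,
                       n_theta=2, n_phi=1):
    """One alternating block-wise proposal: leapfrog the low-level parameters theta
    with the hyperparameter phi held fixed, then leapfrog phi with theta held fixed.
    The step counts (2 and 1) mirror the quoted experiment setup."""
    p_theta = np.random.randn(*np.shape(theta))
    p_phi = np.random.randn(*np.shape(phi))
    theta, p_theta = leapfrog(theta, p_theta,
                              lambda t: grad_logp_theta(t, phi), eps, n_theta)
    phi, p_phi = leapfrog(phi, p_phi,
                          lambda h: grad_logp_phi(theta, h), eps, n_phi)
    # A complete sampler would add a Metropolis-Hastings accept/reject step on the
    # joint Hamiltonian; it is omitted here to keep the sketch short.
    return theta, phi
```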
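
Finally, the Experiment Setup row reports that step sizes were manually tuned so that the acceptance rate lands around 70-85%. Below is a minimal sketch of that kind of pilot-run tuning loop, assuming a hypothetical `run_pilot_chain` helper (not from the paper) that runs a short preliminary chain and returns its empirical acceptance rate.

```python
def tune_step_size(run_pilot_chain, eps=0.1, lo=0.70, hi=0.85, max_rounds=20):
    """Adjust the leapfrog step size until a short pilot run's acceptance rate
    falls inside the target band."""
    for _ in range(max_rounds):
        accept_rate = run_pilot_chain(step_size=eps)
        if accept_rate < lo:
            eps *= 0.8       # too many rejections: shrink the step size
        elif accept_rate > hi:
            eps *= 1.2       # very high acceptance: the step size can grow
        else:
            break            # acceptance rate inside the 70-85% band
    return eps
```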