Clone MCMC: Parallel High-Dimensional Gaussian Gibbs Sampling

Authors: Andrei-Cristian Barbos, François Caron, Jean-François Giovannelli, Arnaud Doucet

NeurIPS 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We show empirically that our method is very flexible and performs well compared to Hogwild-type algorithms.
Researcher Affiliation | Academia | Andrei-Cristian Barbos (IMS Laboratory, Univ. Bordeaux, CNRS, BINP, andbarbos@u-bordeaux.fr); François Caron (Department of Statistics, University of Oxford, caron@stats.ox.ac.uk); Jean-François Giovannelli (IMS Laboratory, Univ. Bordeaux, CNRS, BINP, giova@ims-bordeaux.fr); Arnaud Doucet (Department of Statistics, University of Oxford, doucet@stats.ox.ac.uk)
Pseudocode | No | No structured pseudocode or algorithm blocks were found in the paper.
Open Source Code | No | The paper does not provide any explicit statement or link regarding the availability of open-source code for the described methodology.
Open Datasets | No | The paper describes an application to image inpainting-deconvolution using an 'unobserved image... of size 1000x1000', but does not provide access information (link, DOI, specific citation) for a publicly available dataset.
Dataset Splits | No | The paper discusses 'burn-in samples' for MCMC, but does not specify dataset splits (e.g., percentages or counts for training, validation, or test sets).
Hardware Specification | Yes | Experiments are run on a GPU with 2688 CUDA cores.
Software Dependencies | No | The paper does not provide specific software names with version numbers needed to replicate the experiment.
Experiment Setup | Yes | The tuning parameter η is set to 1. We run our clone MCMC algorithm for n_s = 19000 samples, out of which the first 4000 were discarded as burn-in samples, using as initialization the observed image with missing entries padded with zero. The observation noise is assumed to be independent of X with Σ_b^{-1} = γ_b I and γ_b = 10^2.
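
The Experiment Setup row above lists the only quantitative settings reported (η = 1, n_s = 19000 samples with 4000 discarded as burn-in, initialization from the zero-padded observed image, and noise precision γ_b = 10^2). The following is a minimal, hypothetical sketch of how such a run could be organized; it is not the authors' implementation, and because the report found no pseudocode for the clone MCMC update itself, a plain componentwise Gaussian Gibbs sweep on a small toy precision matrix is used as a stand-in.

```python
import numpy as np

# Reported experiment settings (taken from the Experiment Setup row above).
eta = 1.0           # tuning parameter η of clone MCMC (not used by the stand-in sweep below)
n_samples = 19_000  # total number of MCMC samples n_s
n_burn_in = 4_000   # initial samples discarded as burn-in
gamma_b = 1e2       # noise precision, Σ_b^{-1} = γ_b · I

def gibbs_sweep(x, Q, mu, rng):
    """One componentwise Gaussian Gibbs sweep targeting N(mu, Q^{-1})."""
    for i in range(x.size):
        prec_i = Q[i, i]
        # Conditional mean of x_i given all other coordinates.
        mean_i = mu[i] - (Q[i, :] @ (x - mu) - prec_i * (x[i] - mu[i])) / prec_i
        x[i] = rng.normal(mean_i, 1.0 / np.sqrt(prec_i))
    return x

rng = np.random.default_rng(0)
d = 16                           # toy dimension; the paper works with a 1000x1000 image
A = rng.standard_normal((d, d))
Q = A @ A.T + d * np.eye(d)      # toy symmetric positive-definite precision matrix
mu = np.zeros(d)

x = np.zeros(d)                  # stand-in for the observed image, missing entries padded with zero
kept = []
for k in range(n_samples):
    x = gibbs_sweep(x, Q, mu, rng)
    if k >= n_burn_in:           # discard the first 4000 samples as burn-in
        kept.append(x.copy())

posterior_mean = np.stack(kept).mean(axis=0)
```

In the actual experiment the state is a 1000x1000 image and the sampler runs on a GPU with 2688 CUDA cores; the toy dimension here only illustrates the burn-in and sample-count bookkeeping reported above.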