Stochastic Bouncy Particle Sampler

Authors: Ari Pakman, Dar Gilboa, David Carlson, Liam Paninski

ICML 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We illustrate these ideas in several examples which outperform previous approaches. [...] 7. Experiments"
Researcher Affiliation | Academia | "1 Statistics Department and Grossman Center for the Statistics of Mind, Columbia University, New York, NY 10027, USA; 2 Duke University, Durham, NC 27708, USA."
Pseudocode | Yes | "Algorithm 1 Bouncy Particle Sampler" (a generic sketch of BPS follows the table)
Open Source Code | Yes | "SBPS code at https://github.com/dargilboa/SBPS-public."
Open Datasets | Yes | "This architecture was trained on the MNIST dataset. [...] To generate the data, we sampled the components of the true w ∈ R^d from Unif[−5, 5] and N data points {x_i} from a d-dimensional zero-mean Gaussian..." (see the data-generation sketch after the table)
Dataset Splits | No | The paper states that the "training set size was N = 8340" for MNIST, but it does not report train/validation/test splits, percentages, or how the data were partitioned for validation.
Hardware Specification | No | The paper does not report hardware details such as GPU or CPU models, memory, or cloud instance types; it describes the experiments but gives no hardware specifics.
Software Dependencies | No | The paper does not specify software dependencies with version numbers (e.g., Python 3.x, PyTorch 1.x, or specific library versions).
Experiment Setup | Yes | "Figure 4 shows results for N = 1000, d = 20, k = 3, n = 100. [...] Mini-batch size was n = 500 for all methods."
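
The Pseudocode row above cites Algorithm 1, the Bouncy Particle Sampler. For orientation, here is a minimal textbook-style sketch of the basic (non-stochastic) BPS in Python; it is not the authors' released SBPS code. It assumes exact gradients and a constant upper bound on the bounce intensity for Poisson thinning, whereas the paper's stochastic variant builds its proposal intensity from noisy mini-batch gradients. The names bps, grad_U, lam_ref, and lam_bound are illustrative.

```python
import numpy as np

def bps(grad_U, x0, n_events, lam_ref=1.0, lam_bound=10.0, rng=None):
    """Generic Bouncy Particle Sampler via Poisson thinning (sketch).

    grad_U    : gradient of the negative log target density U
    lam_ref   : velocity refreshment rate
    lam_bound : constant upper bound on the bounce rate max(0, <v, grad_U>)
                along each segment (assumed valid here for simplicity)
    """
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    v = rng.standard_normal(x.shape)
    v /= np.linalg.norm(v)
    path = [x.copy()]
    for _ in range(n_events):
        t_ref = rng.exponential(1.0 / lam_ref)      # next refreshment time
        t = 0.0
        while True:
            t += rng.exponential(1.0 / lam_bound)   # proposed bounce time
            if t >= t_ref:                          # refreshment fires first
                x = x + t_ref * v
                v = rng.standard_normal(x.shape)
                v /= np.linalg.norm(v)
                break
            rate = max(0.0, v @ grad_U(x + t * v))  # true bounce intensity
            if rng.uniform() < rate / lam_bound:    # thinning acceptance
                x = x + t * v
                g = grad_U(x)
                v = v - 2.0 * (v @ g) / (g @ g) * g  # reflect off the gradient
                break
        path.append(x.copy())
    return np.array(path)

# Example: 2-D standard Gaussian target, U(x) = ||x||^2 / 2, so grad_U(x) = x.
events = bps(lambda x: x, x0=np.zeros(2), n_events=2000)
```

Note that BPS output is a piecewise-linear trajectory: expectations are estimated by integrating along the segments between events, not by averaging the event points alone.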
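
The Open Datasets and Experiment Setup rows quote the synthetic logistic-regression setup (true w with components from Unif[−5, 5], Gaussian covariates, N = 1000, d = 20, mini-batch n = 100 in Figure 4). A small sketch of that data-generation recipe plus a mini-batch gradient estimator is below; identity covariance for the Gaussian and Bernoulli labels through the logistic link are assumptions since the quote is truncated, and the helpers make_logistic_data and minibatch_grad are hypothetical names, not the authors' code.

```python
import numpy as np

def make_logistic_data(N=1000, d=20, rng=None):
    """Synthetic data following the quoted recipe: true w ~ Unif[-5, 5]^d,
    covariates from a zero-mean Gaussian (identity covariance assumed),
    binary labels through the logistic link (assumed)."""
    rng = rng or np.random.default_rng(0)
    w_true = rng.uniform(-5.0, 5.0, size=d)
    X = rng.standard_normal((N, d))
    p = 0.5 * (1.0 + np.tanh(0.5 * (X @ w_true)))   # numerically stable sigmoid
    y = rng.binomial(1, p)
    return X, y, w_true

def minibatch_grad(w, X, y, n=100, rng=None):
    """Mini-batch estimate of the full-data gradient of the negative
    log-likelihood, rescaled by N/n, as consumed by stochastic
    gradient-based samplers (n = 100 matches the Figure 4 setting)."""
    rng = rng or np.random.default_rng(0)
    idx = rng.choice(len(y), size=n, replace=False)
    p = 0.5 * (1.0 + np.tanh(0.5 * (X[idx] @ w)))
    return -(len(y) / n) * X[idx].T @ (y[idx] - p)

X, y, w_true = make_logistic_data(N=1000, d=20)
g = minibatch_grad(np.zeros(20), X, y, n=100)
```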