A Complete Recipe for Stochastic Gradient MCMC

Authors: Yi-An Ma, Tianqi Chen, Emily Fox

NeurIPS 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Our experiments on simulated data and a streaming Wikipedia analysis demonstrate that the proposed SGRHMC sampler inherits the benefits of Riemann HMC, with the scalability of stochastic gradient methods."
Researcher Affiliation | Academia | Yi-An Ma, Tianqi Chen, and Emily B. Fox, University of Washington, {yianma@u,tqchen@cs,ebfox@stat}.washington.edu
Pseudocode | Yes | Algorithm 1: Generalized Stochastic Gradient Riemann Hamiltonian Monte Carlo (a minimal illustrative instance of the recipe is sketched after this table).
Open Source Code | No | The paper does not provide a link to open-source code or explicitly state that code for the described methodology is available.
Open Datasets | No | The paper mentions "a streaming Wikipedia analysis using latent Dirichlet allocation" and a "latent Dirichlet allocation (LDA) model on a large Wikipedia dataset". Although Wikipedia is publicly available, the paper provides no link, DOI, or formal citation for the specific dataset used, which limits reproducibility.
Dataset Splits | No | The paper mentions "simulated data" and a "streaming Wikipedia analysis" but does not specify training, validation, or test splits (e.g., percentages or sample counts) for these experiments.
Hardware Specification | No | The paper does not provide details of the hardware (e.g., GPU/CPU models, memory) used to run the experiments.
Software Dependencies | No | The paper does not specify software dependencies with version numbers (e.g., programming languages, libraries, or frameworks) used in the experiments.
Experiment Setup | No | The paper states: "The Supplement contains details on the specific samplers considered and the parameter settings used in these experiments." However, no specific hyperparameters or detailed training configurations are given in the main body of the paper.
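Since the paper's Algorithm 1 (generalized SGRHMC) is given only as pseudocode, the following is a minimal sketch of the simplest instance of the paper's general recipe, stochastic gradient Langevin dynamics (i.e., the SDE with D = I and Q = 0), not the authors' SGRHMC sampler. The target distribution, the exact-gradient stand-in for a minibatch gradient, the step size, and the function names are all illustrative assumptions.

```python
import numpy as np

def sgld_step(theta, grad_log_post_minibatch, eps, rng):
    """One stochastic gradient Langevin dynamics update, the simplest case of
    the recipe's SDE (D = I, Q = 0):
        theta <- theta - eps * grad U~(theta) + N(0, 2 * eps * I),
    where grad U~ is a (possibly minibatch) estimate of the gradient of the
    negative log posterior U(theta)."""
    grad_U = -grad_log_post_minibatch(theta)  # noisy gradient of U(theta)
    noise = rng.normal(scale=np.sqrt(2.0 * eps), size=theta.shape)
    return theta - eps * grad_U + noise

# Illustrative usage: sample from a 2-D standard Gaussian "posterior";
# here the gradient is exact rather than a true minibatch estimate.
rng = np.random.default_rng(0)
theta = np.zeros(2)
samples = []
for t in range(5000):
    theta = sgld_step(theta, lambda th: -th, eps=1e-2, rng=rng)
    samples.append(theta.copy())
print(np.mean(samples, axis=0), np.var(samples, axis=0))  # roughly 0 and 1
```

The paper's SGRHMC sampler additionally maintains a momentum variable and preconditions with a Riemannian metric G(θ); this sketch omits both and is intended only to show the form of a stochastic gradient MCMC update covered by the recipe.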