Stochastic variational inference for hidden Markov models
Authors: Nicholas Foti, Jason Xu, Dillon Laird, Emily B. Fox
NeurIPS 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate the effectiveness of our algorithm on synthetic experiments and a large genomics dataset where a batch algorithm is computationally infeasible. |
| Researcher Affiliation | Academia | Nicholas J. Foti, Jason Xu, Dillon Laird, and Emily B. Fox; University of Washington; {nfoti@stat,jasonxu@stat,dillonl2@cs,ebfox@stat}.washington.edu |
| Pseudocode | Yes | Algorithm 1 (Stochastic Variational Inference for HMMs, SVIHMM) and Algorithm 2 (GrowBuf procedure). |
| Open Source Code | No | The paper does not explicitly state that source code is provided or offer a link to a repository. |
| Open Datasets | Yes | We apply the SVIHMM algorithm to a massive human chromatin dataset provided by the ENCODE project [24]. [24] ENCODE Project Consortium. An integrated encyclopedia of DNA elements in the human genome. Nature, 489(7414):57–74, September 2012. |
| Dataset Splits | Yes | In Fig. 1(b), we see similar trends in terms of predictive log-probability holding out 10% of the observations as a test set and using 5-fold cross validation. |
| Hardware Specification | No | The paper does not specify the exact hardware (e.g., CPU/GPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper does not specify software dependencies with version numbers. |
| Experiment Setup | Yes | For each parameter setting, we ran 20 random restarts of SVIHMM for 100 iterations and batch VB until convergence of the ELBO. A forgetting rate κ parametrizes the step sizes ρ_n = (1 + n)^(−κ). We fix the total number of observations L·M used per iteration of SVIHMM, so that increasing M implies decreasing L (and vice versa). |
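
The step-size schedule and fixed per-iteration observation budget quoted in the Experiment Setup row can be made concrete with a short sketch. This is an illustrative Python fragment, not the authors' implementation (no source code was released); the function names (`step_size`, `sample_subchains`, `svi_global_update`), the value of κ, and the toy data are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def step_size(n, kappa=0.6):
    # Forgetting-rate schedule quoted above: rho_n = (1 + n)^(-kappa).
    # The value kappa = 0.6 here is arbitrary, not taken from the paper.
    return (1.0 + n) ** (-kappa)

def sample_subchains(y, M, L):
    # Draw a minibatch of M subchains of length L from one long sequence y.
    # The total number of observations per iteration, L * M, is held fixed,
    # so increasing M implies decreasing L (and vice versa).
    starts = rng.integers(0, len(y) - L + 1, size=M)
    return [y[s:s + L] for s in starts]

def svi_global_update(lam, prior, scaled_stats, rho):
    # Generic stochastic variational update of a global natural parameter:
    # blend the old value with the noisy estimate formed from the minibatch.
    return (1.0 - rho) * lam + rho * (prior + scaled_stats)

# Toy usage with a fixed budget of L * M = 200 observations per iteration.
y = rng.normal(size=100_000)
lam = np.zeros(2)                      # placeholder global parameter
prior = np.array([1.0, 1.0])           # placeholder prior natural parameter
for n in range(100):
    minibatch = sample_subchains(y, M=10, L=20)
    # A full SVIHMM iteration would run forward-backward on each (buffered)
    # subchain here to compute rescaled expected sufficient statistics.
    scaled_stats = np.zeros_like(lam)  # placeholder for those statistics
    lam = svi_global_update(lam, prior, scaled_stats, step_size(n))
```

The sketch only mirrors the structure described in the row: subchain minibatches under a fixed L·M budget and Robbins-Monro step sizes; the local forward-backward step and the GrowBuf buffering of subchain edges from Algorithms 1 and 2 are left as placeholders.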