Quasi-Monte Carlo Variational Inference
Authors: Alexander Buchholz, Florian Wenzel, Stephan Mandt
ICML 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We furthermore provide theoretical guarantees on QMC for Monte Carlo objectives that go beyond MCVI, and support our findings by several experiments on large-scale data sets from various domains. |
| Researcher Affiliation | Collaboration | (1) ENSAE-CREST, Paris; (2) TU Kaiserslautern, Germany; (3) Disney Research, Los Angeles, USA. |
| Pseudocode | Yes | Algorithm 1: Quasi-Monte Carlo Variational Inference |
| Open Source Code | No | In Appendix D we show how our approach can be easily implemented in your existing code. |
| Open Datasets | Yes | As in (Miller et al., 2017), we apply this model to the frisk data set (Gelman et al., 2006) that contains information on the number of stop-and-frisk events within different ethnicity groups. |
| Dataset Splits | No | The paper describes the datasets used (e.g., 100-row subsample of wine dataset) but does not provide specific split information like percentages or counts for training, validation, or test sets. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory amounts) used for running the experiments. |
| Software Dependencies | No | The paper mentions software like 'randtoolbox' (Christophe and Petr, 2015) and 'Adam optimizer' (Kingma and Ba, 2015) but does not provide specific version numbers for these or other software dependencies. |
| Experiment Setup | Yes | In the first three experiments we optimize the ELBO using the Adam optimizer (Kingma and Ba, 2015) with the initial step size set to 0.1, unless otherwise stated. ... For the score function estimator, we set the initial step size of Adam to 0.01. |
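
The Pseudocode and Experiment Setup rows above refer to Algorithm 1, whose core idea is to replace the i.i.d. samples of standard Monte Carlo variational inference with a randomized quasi-Monte Carlo (RQMC) sequence. The sketch below is not the authors' implementation: the toy target, the use of SciPy's scrambled Sobol generator in place of the paper's `randtoolbox`, and the plain gradient-ascent loop standing in for Adam (initial step size 0.1, as reported above) are all illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of the idea in Algorithm 1:
# swap the i.i.d. uniform base samples of reparameterized MCVI for a
# randomized (scrambled) Sobol sequence. The toy target and all names below
# are illustrative assumptions; the paper draws RQMC points with the R
# package 'randtoolbox' and optimizes with Adam (initial step size 0.1).
import numpy as np
from scipy.stats import norm, qmc


def elbo_gradients(mu, log_sigma, n_samples=64):
    """RQMC estimate of the reparameterized ELBO gradient for a diagonal Gaussian q.

    The toy target is p(z) = N(0, I), so grad_z log p(z) = -z.
    """
    d = mu.size
    # Scrambled Sobol points in (0, 1)^d, mapped to N(0, I) via the inverse CDF.
    u = qmc.Sobol(d=d, scramble=True).random(n_samples)
    eps = norm.ppf(np.clip(u, 1e-7, 1 - 1e-7))
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps                      # reparameterization trick
    dlogp_dz = -z                             # analytic here; autodiff in practice
    grad_mu = dlogp_dz.mean(axis=0)
    # The Gaussian entropy term adds +1 per coordinate to the log_sigma gradient.
    grad_log_sigma = (dlogp_dz * eps * sigma).mean(axis=0) + 1.0
    return grad_mu, grad_log_sigma


# Plain gradient ascent standing in for Adam, with the paper's step size 0.1.
mu, log_sigma = np.full(2, 3.0), np.full(2, 1.0)
for _ in range(300):
    g_mu, g_ls = elbo_gradients(mu, log_sigma)
    mu += 0.1 * g_mu
    log_sigma += 0.1 * g_ls
print(mu, np.exp(log_sigma))  # should approach roughly (0, 0) and (1, 1)
```

Re-scrambling the Sobol points at each iteration keeps the gradient estimator unbiased while typically reducing its variance relative to i.i.d. sampling, which is the property the paper's Algorithm 1 exploits.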