Importance Weighting and Variational Inference
Authors: Justin Domke, Daniel R. Sheldon
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | All the following experiments compare E-IWVI using Student-t distributions to IWVI using Gaussians. |
| Researcher Affiliation | Academia | 1 College of Information and Computer Sciences, University of Massachusetts Amherst 2 Department of Computer Science, Mount Holyoke College |
| Pseudocode | Yes | Algorithm 1: A generative process for q_M(z_{1:M}) |
| Open Source Code | No | The paper does not provide an unambiguous statement or a link to open-source code for the methodology described. |
| Open Datasets | Yes | From top: Madelon (d = 500) Sonar (d = 60), Mushrooms (d = 112). |
| Dataset Splits | No | The paper does not provide specific dataset split information (exact percentages, sample counts, or detailed splitting methodology) needed to reproduce the data partitioning. |
| Hardware Specification | No | The paper does not provide specific hardware details (exact GPU/CPU models, processor types, or memory amounts) used for running its experiments. |
| Software Dependencies | Yes | Stan Development Team. Modeling language user's guide and reference manual, version 2.17.0 |
| Experiment Setup | Yes | On these, we used a fixed set of 10,000 M random inputs to T and optimized using batch L-BFGS, avoiding heuristic tuning of a learning rate sequence. Finally, we considered a (non-conjugate) logistic regression model with a Cauchy prior with a scale of 10, using stochastic gradient descent with various step sizes. |
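For context on the method being assessed: IWVI maximizes the importance-weighted ELBO, log((1/M) Σ_m p(z_m, x)/q(z_m)) averaged over M i.i.d. draws from q, which tightens toward log p(x) as M grows. A minimal, generic sketch of this estimator (not the authors' code; the toy target, proposal, and function names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def iw_elbo(log_p, sample_q, log_q, M, n_draws=1000):
    """Monte Carlo estimate of the importance-weighted ELBO with M samples.

    log_p    : unnormalized log target, log p(z, x)
    sample_q : draws M i.i.d. samples from the variational distribution q
    log_q    : log-density of q
    """
    estimates = np.empty(n_draws)
    for i in range(n_draws):
        z = sample_q(M)                      # M i.i.d. draws from q
        log_w = log_p(z) - log_q(z)          # log importance weights
        # log (1/M) sum_m w_m, computed stably via log-sum-exp
        estimates[i] = np.logaddexp.reduce(log_w) - np.log(M)
    return estimates.mean()

# Toy check (assumed example): standard-normal target, wider Gaussian proposal,
# so log p(x) = 0 and the bound should tighten toward 0 as M grows.
log_p = lambda z: -0.5 * z**2 - 0.5 * np.log(2 * np.pi)
sigma = 2.0
sample_q = lambda M: rng.normal(0.0, sigma, size=M)
log_q = lambda z: -0.5 * (z / sigma)**2 - 0.5 * np.log(2 * np.pi * sigma**2)

results = {M: iw_elbo(log_p, sample_q, log_q, M) for M in (1, 5, 20)}
print(results)
```

With M = 1 this reduces to the standard ELBO; larger M gives a tighter bound, which is the effect the paper's experiments measure (and E-IWVI swaps the Gaussian q for a heavier-tailed Student-t proposal).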