Nested Variational Inference
Authors: Heiko Zimmermann, Hao Wu, Babak Esmaeili, Jan-Willem van de Meent
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments apply NVI to (a) sample from a multimodal distribution using a learned annealing path, (b) learn heuristics that approximate the likelihood of future observations in a hidden Markov model, and (c) perform amortized inference in hierarchical deep generative models. |
| Researcher Affiliation | Academia | Institute of Informatics, University of Amsterdam; Khoury College of Computer Sciences, Northeastern University |
| Pseudocode | No | The paper describes algorithms and methods using mathematical equations and textual descriptions, but it does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any statement about releasing open-source code or provide a link to a code repository. |
| Open Datasets | Yes | We evaluate NVI for the BGMM-VAE using the following procedure. We generate mini-batches with a sampled λ (for which we make use of class labels that are not provided to the model). ... The BGMM-VAE is trained on the publicly available MNIST and Fashion MNIST datasets. |
| Dataset Splits | No | The paper does not specify the exact percentages or counts for training, validation, or test splits. It refers to 'test mini-batch' and 'test instances' but lacks detailed split information. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for running the experiments (e.g., GPU models, CPU types, memory). |
| Software Dependencies | No | The paper does not list specific software dependencies with version numbers (e.g., Python 3.x, PyTorch 1.x). |
| Experiment Setup | No | The paper describes the methods and tasks, but does not provide specific details on hyperparameters, optimizer settings, or other explicit training configurations in the main text. |