Neural Inverse Transform Sampler
Authors: Henry Li, Yuval Kluger
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate the applicability of NITS by applying it to realistic, high-dimensional density estimation tasks: likelihood-based generative modeling on the CIFAR-10 dataset, and density estimation on the UCI suite of benchmark datasets, where NITS produces compelling results rivaling or surpassing the state of the art. In this section, we extensively evaluate the capabilities of our proposed approach via density estimation on two vastly different modalities. |
| Researcher Affiliation | Academia | 1Department of Applied Mathematics, Yale University, New Haven, CT. Correspondence to: Henry Li <henry.li@yale.edu>. |
| Pseudocode | Yes | Algorithm 1 Sampling from Pθ |
| Open Source Code | Yes | Code available at github.com/lihenryhfl/NITS. |
| Open Datasets | Yes | likelihood-based generative modeling on the CIFAR-10 dataset, and density estimation on the UCI suite of benchmark datasets. |
| Dataset Splits | No | The paper mentions 'TRAIN SET SIZE' for some datasets and implies the use of a validation set for early stopping, but it does not provide specific split percentages or absolute counts for train/validation/test splits, nor does it cite a standard split with precise details. |
| Hardware Specification | Yes | All models were trained on a Nvidia RTX 2080 Ti GPU. |
| Software Dependencies | No | The paper does not specify software dependencies with version numbers (e.g., Python, PyTorch, CUDA versions). |
| Experiment Setup | Yes | Models were trained with early stopping with a patience of 5 epochs, and a learning rate of 2e-4. Per-dataset hyperparameters: hidden dim 1024 / 1024 / 512 / 128 / 1024; dropout 0.2 / 0.2 / 0.3 / 0.1 / 0.2; residual blocks 4 / 4 / 4 / 8 / 4. |
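The pseudocode row above cites "Algorithm 1 Sampling from Pθ", i.e. drawing samples by inverting a learned CDF. A minimal sketch of the underlying inverse-transform idea, assuming the model exposes a monotone CDF as a callable (here stood in by a logistic sigmoid, not the paper's actual network):

```python
import math
import random

def inverse_transform_sample(cdf, lo, hi, u, tol=1e-6, max_iter=100):
    """Invert a monotone CDF by bisection: find x with cdf(x) ~= u.

    `cdf` is any monotone callable on [lo, hi]; in NITS this would be the
    neural CDF, but here it is a placeholder assumption.
    """
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if cdf(mid) < u:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Stand-in for a learned CDF: a logistic sigmoid (monotone, maps R -> (0, 1)).
def toy_cdf(x):
    return 1.0 / (1.0 + math.exp(-x))

# Draw a sample: u ~ Uniform(0, 1), then x = CDF^{-1}(u).
random.seed(0)
x = inverse_transform_sample(toy_cdf, -20.0, 20.0, random.random())
```

Bisection only needs CDF evaluations (no analytic inverse), which is what makes the scheme applicable to black-box monotone networks.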
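The experiment-setup row states that models were trained with early stopping at a patience of 5 epochs. A generic sketch of that stopping rule, with the training and validation callables (`step_fn`, `val_loss_fn`) as hypothetical placeholders rather than the authors' code:

```python
def train_with_early_stopping(step_fn, val_loss_fn, patience=5, max_epochs=200):
    """Stop after `patience` consecutive epochs without validation improvement.

    `step_fn(epoch)` runs one training epoch; `val_loss_fn()` returns the
    current validation loss. Returns the best loss and the epoch it occurred.
    """
    best, bad_epochs, best_epoch = float("inf"), 0, -1
    for epoch in range(max_epochs):
        step_fn(epoch)
        loss = val_loss_fn()
        if loss < best:
            best, bad_epochs, best_epoch = loss, 0, epoch
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break  # patience exhausted: stop training
    return best, best_epoch
```

With patience 5, training halts once five epochs pass without a new best validation loss, matching the setup quoted above.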