Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow
Authors: Didrik Nielsen, Ole Winther
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Finally, we study multilayer flows composed of PixelCNNs and non-autoregressive coupling layers and demonstrate state-of-the-art results on CIFAR-10 for flow models trained with dequantization. |
| Researcher Affiliation | Academia | Didrik Nielsen Technical University of Denmark didni@dtu.dk Ole Winther Technical University of Denmark olwi@dtu.dk |
| Pseudocode | No | No structured pseudocode or algorithm blocks were found in the paper. |
| Open Source Code | Yes | The code used for experiments is publicly available at https://github.com/didriknielsen/pixelcnn_flow. |
| Open Datasets | Yes | We trained PixelCNN (van den Oord et al., 2016c) and PixelCNN++ (Salimans et al., 2017) as flow models on CIFAR-10. |
| Dataset Splits | No | The paper mentions using CIFAR-10, which has standard splits, but does not explicitly provide the train/validation/test split percentages or sample counts within the text. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, processor types, or memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details (e.g., library or solver names with version numbers) needed to replicate the experiment. |
| Experiment Setup | No | The paper describes the architectural composition of models and the dequantization method used, but does not provide specific hyperparameter values (e.g., learning rate, batch size, number of epochs, optimizer settings) or a detailed table/paragraph of training configurations. |