From Variational to Deterministic Autoencoders
Authors: Partha Ghosh, Mehdi S. M. Sajjadi, Antonio Vergari, Michael J. Black, Bernhard Schölkopf
ICLR 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We show, in a rigorous empirical study, that the proposed regularized deterministic autoencoders are able to generate samples that are comparable to, or better than, those of VAEs and more powerful alternatives when applied to images as well as to structured data such as molecules. |
| Researcher Affiliation | Academia | Max Planck Institute for Intelligent Systems, Tübingen, Germany {pghosh,msajjadi,black,bs}@tue.mpg.de; University of California, Los Angeles, USA aver@cs.ucla.edu |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | An implementation is available at: https://github.com/ParthaEth/Regularized_autoencoders-RAE- |
| Open Datasets | Yes | MNIST (LeCun et al., 1998), CIFAR-10 (Krizhevsky & Hinton, 2009) and CelebA (Liu et al., 2015). |
| Dataset Splits | Yes | We use the official train, validation and test splits of CelebA. For MNIST and CIFAR, we set aside 10k train samples for validation. (See the split sketch after this table.) |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models or cloud instance types used for experiments. |
| Software Dependencies | No | The paper mentions that they 'follow the models adopted by Tolstikhin et al. (2017)', but does not list specific software components with version numbers (e.g., Python, PyTorch, TensorFlow versions). |
| Experiment Setup | Yes | For all experiments, we use the Adam optimizer with a starting learning rate of 10^-3, which is cut in half every time the validation loss plateaus. All models are trained for a maximum of 100 epochs on MNIST and CIFAR and 70 epochs on CelebA. We use a mini-batch size of 100 and pad MNIST digits with zeros to make the size 32×32. (See the training sketch after this table.) |
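
The dataset handling reported above (official CelebA splits; 10k samples held out from the MNIST/CIFAR training sets; MNIST zero-padded to 32×32) could be reproduced along the following lines. This is a minimal PyTorch sketch of our own, since the paper names no framework (see the Software Dependencies row); the data root and split seed are arbitrary assumptions.

```python
import torch
from torch.utils.data import random_split
from torchvision import datasets, transforms

# Pad 28x28 MNIST digits with zeros to 32x32, as stated in the paper.
mnist_tf = transforms.Compose([
    transforms.Pad(2),  # 2 pixels on each side: 28 + 4 = 32
    transforms.ToTensor(),
])

train_full = datasets.MNIST("data", train=True, download=True, transform=mnist_tf)
test_set = datasets.MNIST("data", train=False, download=True, transform=mnist_tf)

# MNIST and CIFAR have no official validation split, so 10k training
# samples are set aside; CelebA uses its official train/val/test splits.
train_set, val_set = random_split(
    train_full,
    [len(train_full) - 10_000, 10_000],
    generator=torch.Generator().manual_seed(0),  # seed is an assumption
)
```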
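
The optimization setup in the last row (Adam at 10^-3, halved whenever the validation loss plateaus, mini-batches of 100, up to 100 epochs on MNIST/CIFAR and 70 on CelebA) maps naturally onto a plateau scheduler. Another hedged PyTorch sketch, continuing from the split above: the tiny autoencoder and plain MSE loss are placeholders, not the paper's architecture (which follows Tolstikhin et al., 2017) or its regularized RAE objective.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader

# Tiny stand-in autoencoder; the paper's models are substantially larger.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(32 * 32, 128), nn.ReLU(),
    nn.Linear(128, 32 * 32), nn.Sigmoid(),
    nn.Unflatten(1, (1, 32, 32)),
)
loss_fn = nn.MSELoss()  # placeholder; the RAE objective adds regularizers

# `train_set` / `val_set` come from the split sketch above.
train_loader = DataLoader(train_set, batch_size=100, shuffle=True)
val_loader = DataLoader(val_set, batch_size=100)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# Learning rate is "cut in half every time the validation loss plateaus".
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, factor=0.5)

max_epochs = 100  # 100 for MNIST/CIFAR, 70 for CelebA
for epoch in range(max_epochs):
    model.train()
    for x, _ in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), x)
        loss.backward()
        optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = sum(loss_fn(model(x), x).item() for x, _ in val_loader)
    scheduler.step(val_loss)
```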