Flat Metric Minimization with Applications in Generative Modeling
Authors: Thomas Möllenhoff, Daniel Cremers
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In experiments, we show that the proposed shift to k > 0 leads to interpretable and disentangled latent representations which behave equivariantly to the specified oriented tangent planes. |
| Researcher Affiliation | Academia | 1Department of Informatics, Technical University of Munich, Garching, Germany. |
| Pseudocode | No | The paper describes mathematical formulations and implementation details, but does not include a clearly labeled pseudocode or algorithm block. |
| Open Source Code | Yes | See https://github.com/moellenh/flatgan for a PyTorch implementation to reproduce Fig. 6 and Fig. 7. |
| Open Datasets | Yes | The paper uses the publicly available MNIST and small NORB datasets: "For MNIST, we compute the tangent vectors manually by rotation and dilation of the digits, similar as done by Simard et al. (1992; 1998). For the small NORB example, the tangent vectors are given as differences between the corresponding images." |
| Dataset Splits | No | The paper mentions using datasets like MNIST, small NORB, and tinyvideos but does not provide specific details on training, validation, or test splits (e.g., percentages, sample counts, or explicit cross-validation setup). |
| Hardware Specification | No | The paper does not provide any specific hardware details such as CPU/GPU models, memory, or cloud instance types used for running experiments. |
| Software Dependencies | No | The paper mentions 'PyTorch implementation' but does not specify version numbers for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | The specific hyperparameters, architectures and tangent vector setups used in practice are detailed in Appendix B. |