Unveiling the Latent Space Geometry of Push-Forward Generative Models
Authors: Thibaut Issenhuth, Ugo Tanielian, Jeremie Mary, David Picard
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Through experiments on GANs, we demonstrate the validity of our theoretical results and gain new insights into the latent space geometry of these models. |
| Researcher Affiliation | Collaboration | (1) Criteo AI Lab, Paris, France; (2) LIGM, École des Ponts, Univ Gustave Eiffel, CNRS, Marne-la-Vallée, France. |
| Pseudocode | No | The paper describes procedures in narrative text and Appendix B.3, but does not include formally labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | Implementation details are given in Appendix B and code is provided in the Supplementary Material. Our code is open-source and can be found there: https://github.com/thibautissenhuth/unveiling_latent_geometry. |
| Open Datasets | Yes | We believe that the assumption of disconnectedness is a reasonable one, particularly for multi-class datasets such as MNIST (LeCun et al., 1998), CIFAR10 (Krizhevsky, 2009), or STL10 (Coates et al., 2011). |
| Dataset Splits | No | The paper mentions 100k training points and 10k test points for a specific dataset construction in Section 4.1, and 10k real/fake images for evaluation metrics in Appendix B.1, but does not explicitly provide percentages or counts for a validation set split. |
| Hardware Specification | Yes | For all datasets, the training of GANs was run on NVIDIA Tesla V100 GPUs (16 GB). |
| Software Dependencies | No | The paper mentions using Adam optimizer and deep learning models, but does not specify versions for software libraries or dependencies like PyTorch, TensorFlow, or Python. |
| Experiment Setup | Yes | The batch size is 256. The learning rate of the discriminator is two times larger (Heusel et al., 2017), i.e. 5×10⁻⁵ for the generator and 1×10⁻⁴ for the discriminator. GANs are trained for 80k steps on MNIST and for 100k steps on CIFAR datasets. Architectures of the generator and discriminator are described in Table 4 and Table 5. (See the sketch after this table.) |
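
For concreteness, below is a minimal PyTorch sketch of the training configuration quoted in the Experiment Setup row. The tiny `nn.Sequential` networks are placeholders (the paper's actual architectures are given in its Tables 4 and 5), and any Adam hyperparameter beyond the two learning rates is an assumption, not something the paper specifies.

```python
from torch import nn, optim

# Placeholder networks standing in for the paper's generator and
# discriminator; the real architectures are in Tables 4 and 5.
generator = nn.Sequential(nn.Linear(128, 784), nn.Tanh())
discriminator = nn.Sequential(nn.Linear(784, 1))

# Hyperparameters quoted above: batch size 256, with the discriminator
# learning rate set to twice the generator's (Heusel et al., 2017).
batch_size = 256
g_opt = optim.Adam(generator.parameters(), lr=5e-5)  # 5×10⁻⁵
d_opt = optim.Adam(discriminator.parameters(), lr=1e-4)  # 1×10⁻⁴

# Reported training lengths per dataset.
num_steps = {"MNIST": 80_000, "CIFAR": 100_000}
```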