A Geometric Analysis of Deep Generative Image Models and Its Applications

Authors: Binxu Wang, Carlos R. Ponce

ICLR 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | An empirical analysis of several pretrained GANs shows that image variation around each position in latent space is concentrated along surprisingly few major axes (the space is highly anisotropic), and that the directions producing this large variation are similar at different positions (the space is homogeneous).
Researcher Affiliation | Academia | Binxu Wang, Department of Neuroscience, Washington University in St. Louis, St. Louis, MO, USA (binxu.wang@wustl.edu); Carlos R. Ponce, Department of Neuroscience, Washington University in St. Louis, St. Louis, MO, USA (crponce@wustl.edu)
Pseudocode | No | The paper provides mathematical formulations and descriptions of methods, but it does not include any clearly labeled 'Pseudocode' or 'Algorithm' blocks.
Open Source Code | No | The paper mentions obtaining several GAN models from public sources like 'torch hub' and 'Hugging Face's translation of Deep Mind's TensorFlow implementation' for their analysis, but it does not state that the code for *their own* geometric framework or methods is open-source or provide a link to it.
Open Datasets | Yes | Progressive Growing GAN (PGGAN) was obtained from torch hub (https://pytorch.org/hub/facebookresearch_pytorch-gan-zoo_pgan/) and we use the 256-pixel version. It's trained on the celebrity faces dataset (CelebA). (A hedged loading sketch follows this table.)
Dataset Splits | No | The paper describes analyzing pre-trained GANs and conducting a human perceptual study, but it does not specify any training, validation, or test dataset splits for its own experimental analysis or the human study data.
Hardware Specification | Yes | Computation time is measured on a GTX 1060 GPU.
Software Dependencies | No | The paper mentions software like 'Pytorch', 'TensorFlow', and 'ARPACK' but does not specify their version numbers.
Experiment Setup | Yes | For each GAN, we randomly sampled 100-1000 z in the latent space, used backpropagation to compute H(z), and then performed the eigendecomposition. (A hedged sketch of this procedure follows this table.)
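
For reference, the torch hub entry point cited in the Open Datasets row can be exercised as below. This follows the published pytorch_GAN_zoo hub example; the model_name 'celebAHQ-256' is an assumption chosen here to match the 256-pixel CelebA model mentioned in the table, not a detail confirmed by the paper.

```python
# Hedged sketch: loading the pretrained PGGAN from torch hub, following the
# pytorch_GAN_zoo hub example. model_name='celebAHQ-256' is assumed to be
# the 256-pixel CelebA model cited in the table above.
import torch

model = torch.hub.load('facebookresearch/pytorch_GAN_zoo:hub', 'PGAN',
                       model_name='celebAHQ-256', pretrained=True,
                       useGPU=torch.cuda.is_available())

noise, _ = model.buildNoiseData(4)   # sample 4 latent vectors z ~ N(0, I)
with torch.no_grad():
    images = model.test(noise)       # generated images, shape (4, 3, 256, 256)
```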
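And a minimal sketch of the Experiment Setup procedure, under two stated assumptions: H(z) is taken to be the Hessian of a squared image distance between G(z) and a fixed reference G(z0), computed by double backpropagation, and plain pixel-space MSE stands in for the paper's perceptual distance. `generator` is a placeholder for any differentiable map from latent vectors to images (e.g., a wrapper around the PGGAN above).

```python
# Hedged sketch: compute H(z0) by backpropagation and eigendecompose it.
# Pixel-space MSE is substituted for the paper's perceptual distance, so the
# resulting spectrum illustrates the procedure, not the reported numbers.
import torch

def metric_eigendecomposition(generator, z0):
    ref = generator(z0).detach()             # reference image G(z0), held fixed

    def sq_distance(z):
        # squared distance d(G(z0), G(z)); its gradient vanishes at z = z0,
        # so the Hessian there is the pullback metric (proportional to J^T J)
        return (generator(z) - ref).pow(2).mean()

    # Hessian of the distance at z0, obtained via double backward
    H = torch.autograd.functional.hessian(sq_distance, z0)
    eigvals, eigvecs = torch.linalg.eigh(H)  # ascending eigenvalues
    return eigvals.flip(0), eigvecs.flip(1)  # reorder descending: top axes first

# Usage (the paper repeats this for 100-1000 sampled z per GAN):
# z0 = torch.randn(512)
# eigvals, eigvecs = metric_eigendecomposition(G, z0)
# Anisotropy appears as eigenvalues decaying over several orders of magnitude.
```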