Invertibility of Convolutional Generative Networks from Partial Measurements

Authors: Fangchang Ma, Ulas Ayaz, Sertac Karaman

NeurIPS 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We further demonstrate, empirically, that the same conclusion extends to networks with multiple layers, other activation functions (leaky ReLU, sigmoid, and tanh), and weights trained on real datasets.
Researcher Affiliation | Collaboration | Fangchang Ma* (MIT, fcma@mit.edu); Ulas Ayaz (MIT, uayaz@mit.edu / uayaz@lyft.com); Sertac Karaman (MIT, sertac@mit.edu). *Both authors contributed equally to this work. Ulas Ayaz is presently affiliated with Lyft, Inc.
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | Yes | The sample code is available at https://github.com/fangchangma/invert-generative-networks.
Open Datasets | Yes | We rescale the raw grayscale images from the MNIST dataset [11] to size 32 × 32. A similar study is conducted on a generative network trained on the CelebFaces [12] dataset.
Dataset Splits | No | The paper does not explicitly specify training, validation, or test dataset splits or percentages; it only mentions the datasets used and the training framework.
Hardware Specification | No | The paper does not provide specific details about the hardware used to run the experiments (e.g., GPU models, CPU types).
Software Dependencies | No | The paper mentions using the "conditional deep convolutional generative adversarial networks (DCGAN)" framework and Adam with learning rate 0.1, but does not specify version numbers for any software or libraries.
Experiment Setup | Yes | The first layer has 16 channels and the second layer has a single channel. Both layers have a kernel size of 5 and a stride of 3. Adam with learning rate 0.1 is used to optimize the latent code z; the optimization usually converges within 500 iterations. The input noise to the generator is set to a relatively small dimension of 10 to ensure a sufficiently expanding network.
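The setup in the last row can be sketched in a few lines of PyTorch. This is an illustrative reconstruction, not the authors' released code: the `Generator` class and variable names are assumptions, and it inverts the network from full observations rather than partial measurements. It uses the reported hyperparameters (two transposed-convolution layers with 16 and 1 channels, kernel size 5, stride 3, latent dimension 10, Adam with learning rate 0.1, up to 500 iterations).

```python
# Hedged sketch (not the paper's code): invert a two-layer
# transposed-convolution generator by gradient descent on the latent code z.
import torch
import torch.nn as nn

torch.manual_seed(0)

LATENT_DIM = 10  # small latent dimension -> sufficiently expanding network


class Generator(nn.Module):
    """Two ConvTranspose2d layers: 16 channels then 1; kernel 5, stride 3."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(LATENT_DIM, 16, kernel_size=5, stride=3),
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=5, stride=3),
        )

    def forward(self, z):
        return self.net(z)


G = Generator()
for p in G.parameters():  # the weights are fixed; only z is optimized
    p.requires_grad_(False)

# Ground-truth latent code and the resulting observation y = G(z_true).
z_true = torch.randn(1, LATENT_DIM, 1, 1)
y = G(z_true)

# Recover z from y with Adam, learning rate 0.1, up to 500 iterations.
z = torch.randn(1, LATENT_DIM, 1, 1, requires_grad=True)
opt = torch.optim.Adam([z], lr=0.1)
losses = []
for _ in range(500):
    opt.zero_grad()
    loss = (G(z) - y).pow(2).sum()  # squared reconstruction error
    loss.backward()
    opt.step()
    losses.append(loss.item())

print(f"initial loss: {losses[0]:.4f}, final loss: {losses[-1]:.4f}")
```

With random Gaussian weights this mirrors the paper's expanding-network regime; swapping in weights trained on MNIST or CelebFaces would reproduce the "trained weights" experiments.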