Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance
Authors: Ziv Goldfeld, Kristjan Greenewald, Kengo Kato
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our theory is supported by empirical results, posing the SWD as a potent tool for learning and inference in high dimensions. [...] Empirical results to support our theory are provided. Using synthetic data we validate both the limiting distributions of parameter estimates and the convergence of the SWD as the number of samples increases. |
| Researcher Affiliation | Collaboration | Ziv Goldfeld (Cornell University, goldfeld@cornell.edu); Kristjan Greenewald (MIT-IBM Watson AI Lab, kristjan.h.greenewald@ibm.com); Kengo Kato (Cornell University, kk976@cornell.edu) |
| Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statements or links indicating that its source code for the described methodology is publicly available. |
| Open Datasets | No | The paper uses 'synthetic data' and describes the generation process ('for a Gaussian mixture (parameterized by the two means, one from each mode)', 'The target distribution is a multivariate standard Gaussian P = Nσ for σ = 1.'), but does not provide a concrete link, DOI, or formal citation for a publicly available dataset. |
| Dataset Splits | No | The paper states that experiments were run with '50 estimation trials' or '50 random trials', but it does not specify explicit dataset splits (e.g., percentages or counts) for training, validation, or testing, as the data is synthetically generated per trial rather than being a fixed dataset. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions using 'the NN-based estimator for WGAN-GP discriminator from [29]' but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | The WGAN-GP discriminator has 3 hidden layers with 512 hidden units each. [...] parameterize Qθ via a three-layer neural network with 256 hidden units per layer. |
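The smooth Wasserstein distance (SWD) studied in the paper is the Wasserstein distance between Gaussian-smoothed versions of the two measures, i.e. $W_1(P * \mathcal{N}_\sigma, Q * \mathcal{N}_\sigma)$. As a minimal illustrative sketch (not the paper's implementation, which uses a WGAN-GP neural estimator), one can estimate it in one dimension by adding Gaussian noise to the samples and using the closed-form 1-D empirical $W_1$ via order statistics. The function name `smooth_w1_1d` and all parameters are hypothetical:

```python
import numpy as np

def smooth_w1_1d(x, y, sigma=1.0, rng=None):
    """Monte Carlo estimate of the 1-D smooth Wasserstein distance
    W1(P * N(0, sigma^2), Q * N(0, sigma^2)) from equal-size samples.

    x, y : 1-D arrays of samples from P and Q (same length).
    sigma: smoothing level; sigma=0 recovers the plain empirical W1.
    """
    rng = np.random.default_rng(rng)
    # Convolving with a Gaussian corresponds to adding independent
    # N(0, sigma^2) noise to each sample.
    xs = x + sigma * rng.standard_normal(x.shape)
    ys = y + sigma * rng.standard_normal(y.shape)
    # In 1-D, W1 between two empirical measures of equal size is the
    # mean absolute difference of their sorted samples (order statistics).
    return np.mean(np.abs(np.sort(xs) - np.sort(ys)))
```

Re-running this for increasing sample sizes with `x` and `y` drawn from the same distribution shows the estimate shrinking toward zero, mirroring the convergence-rate experiments reported in the paper (50 trials per sample size).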