Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Asymptotic Guarantees for Generative Modeling Based on the Smooth Wasserstein Distance
Authors: Ziv Goldfeld, Kristjan Greenewald, Kengo Kato
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our theory is supported by empirical results, posing the SWD as a potent tool for learning and inference in high dimensions. [...] Empirical results to support our theory are provided. Using synthetic data we validate both the limiting distributions of parameter estimates and the convergence of the SWD as the number of samples increases. |
| Researcher Affiliation | Collaboration | Ziv Goldfeld, Cornell University; Kristjan Greenewald, MIT-IBM Watson AI Lab; Kengo Kato, Cornell University |
| Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statements or links indicating that its source code for the described methodology is publicly available. |
| Open Datasets | No | The paper uses 'synthetic data' and describes the generation process ('for a Gaussian mixture (parameterized by the two means, one from each mode)', 'The target distribution is a multivariate standard Gaussian P = N(0, I_d)'), but does not provide a concrete link, DOI, or formal citation for a publicly available dataset. |
| Dataset Splits | No | The paper states that experiments were run with '50 estimation trials' or '50 random trials', but it does not specify explicit dataset splits (e.g., percentages or counts) for training, validation, or testing, as the data is synthetically generated per trial rather than being a fixed dataset. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions using 'the NN-based estimator for WGAN-GP discriminator from [29]' but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | The WGAN-GP discriminator has 3 hidden layers with 512 hidden units each. [...] parameterize Qθ via a three-layer neural network with 256 hidden units per layer. |
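The layer widths quoted in the Experiment Setup row can be sketched as forward passes through plain multilayer perceptrons. This is a minimal NumPy illustration of the reported shapes only; the activations (ReLU), input/output dimensions, initialization, and batch size are assumptions not stated in the extracted text, and the paper's actual WGAN-GP training loop is not reproduced here.

```python
import numpy as np

def mlp_forward(x, widths, rng):
    """Forward pass through an MLP: ReLU hidden layers, linear output.

    `widths` lists the hidden-layer sizes followed by the output size.
    Weights are drawn fresh here purely to demonstrate the shapes.
    """
    h = x
    d_in = x.shape[1]
    for d_out in widths[:-1]:
        W = rng.standard_normal((d_in, d_out)) * np.sqrt(2.0 / d_in)
        h = np.maximum(h @ W, 0.0)  # ReLU hidden layer
        d_in = d_out
    W = rng.standard_normal((d_in, widths[-1])) * np.sqrt(1.0 / d_in)
    return h @ W  # linear output layer

rng = np.random.default_rng(0)
z = rng.standard_normal((8, 10))  # batch of latent samples (dim assumed)
x = rng.standard_normal((8, 10))  # batch of data samples (dim assumed)

# Generator Q_theta: three layers, 256 hidden units per layer (output dim assumed).
samples = mlp_forward(z, [256, 256, 256, 10], rng)
# WGAN-GP discriminator: 3 hidden layers, 512 units each, scalar critic output.
critic = mlp_forward(x, [512, 512, 512, 1], rng)
print(samples.shape, critic.shape)  # → (8, 10) (8, 1)
```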