Constructive Universal High-Dimensional Distribution Generation through Deep ReLU Networks
Authors: Dmytro Perekrestenko, Stephan Müller, Helmut Bölcskei
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We present an explicit deep neural network construction that transforms uniformly distributed one-dimensional noise into an arbitrarily close approximation of any two-dimensional Lipschitz-continuous target distribution. The key ingredient of our design is a generalization of the space-filling property of sawtooth functions discovered in (Bailey & Telgarsky, 2018). We elicit the importance of depth in our neural network construction in driving the Wasserstein distance between the target distribution and the approximation realized by the network to zero. An extension to output distributions of arbitrary dimension is outlined. Finally, we show that the proposed construction does not incur a cost in terms of error measured in Wasserstein distance relative to generating d-dimensional target distributions from d independent random variables. |
| Researcher Affiliation | Academia | 1) Department of Information Technology and Electrical Engineering, ETH Zürich, Zürich, Switzerland; 2) Department of Mathematics, ETH Zürich, Zürich, Switzerland. Correspondence to: Dmytro Perekrestenko <pdmytro@mins.ee.ethz.ch>. |
| Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement or link regarding the open-sourcing of the code for the described methodology. |
| Open Datasets | No | The paper is theoretical and focuses on mathematical constructions for distribution generation. It does not use or refer to any publicly available or open datasets for training. |
| Dataset Splits | No | The paper is theoretical and does not involve experimental validation on datasets, thus no dataset splits for training, validation, or testing are provided. |
| Hardware Specification | No | The paper is theoretical and does not mention any hardware specifications used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not mention any software dependencies with specific version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with specific hyperparameters or training configurations. |
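Although the paper provides no code, the sawtooth (tent-map) functions it builds on are simple to realize with ReLU units. The following is a minimal illustrative sketch, not the authors' construction: it shows how a single tent map is expressed with two ReLUs and how s-fold composition (i.e., a deeper network) produces the 2^(s-1)-tooth sawtooth whose oscillation underlies the space-filling property.

```python
def relu(x):
    """Rectified linear unit."""
    return max(x, 0.0)

def tent(x):
    """One tent map on [0, 1] as a two-ReLU layer:
    g(x) = 2*relu(x) - 4*relu(x - 0.5), so g rises from 0 to 1
    on [0, 0.5] and falls back to 0 on [0.5, 1]."""
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def sawtooth(x, s):
    """s-fold composition of the tent map, i.e. a depth-s ReLU
    network realizing the sawtooth with 2^(s-1) teeth on [0, 1]."""
    for _ in range(s):
        x = tent(x)
    return x
```

For example, `sawtooth(0.25, 1)` gives 0.5 (halfway up the first tooth), while `sawtooth(0.25, 2)` gives 1.0 (the peak of the first of two teeth); each extra layer doubles the number of oscillations, which is why depth, rather than width, drives the approximation in constructions of this kind.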