Generative Well-intentioned Networks
Authors: Justin Cosentino, Jun Zhu
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We then empirically evaluate the effectiveness of the proposed framework in Section 5. Lastly, we discuss related works in Section 6. We evaluate the WGWIN-GP using the training procedure outlined in Section 4 and the inference method illustrated in Figure 1. |
| Researcher Affiliation | Academia | Justin Cosentino, Jun Zhu Dept. of Comp. Sci. & Tech., Institute for AI, THBI Lab, BNRist Center, State Key Lab for Intell. Tech. & Sys., Tsinghua University, Beijing, China |
| Pseudocode | Yes | Algorithm 1: WGWIN with gradient and transformation penalty. |
| Open Source Code | No | The paper does not provide an explicit statement about releasing code or a link to a code repository for the described methodology. |
| Open Datasets | Yes | We use two different datasets in our experiments: the MNIST handwritten digits [23] dataset and the Fashion-MNIST clothing dataset [41]. |
| Dataset Splits | Yes | We further split both training sets into a 50,000 image training set and 10,000 image validation set. |
| Hardware Specification | Yes | We trained and evaluated the models using NVIDIA GeForce GTX TITAN X GPUs. |
| Software Dependencies | No | The network is implemented using TensorFlow Probability [7]. No specific version numbers for software dependencies are provided. |
| Experiment Setup | Yes | The BNN trained for 30 epochs using a learning rate of 0.001 and batch size of 128. The GWIN trained for 200,000 iterations using the default hyperparameters listed in Algorithm 1. Both the generator and critic used a learning rate of 0.0001 and batch size of 128. |
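The reported hyperparameters can be collected into a small sketch for anyone attempting to reproduce the setup. This is not the authors' code: the config names are hypothetical, and the step-count arithmetic assumes the 50,000-image training split quoted above is consumed in full batches of 128 (last partial batch included).

```python
import math

# Hyperparameters as quoted in the Experiment Setup row (hypothetical names).
BNN_CONFIG = {"epochs": 30, "learning_rate": 1e-3, "batch_size": 128}
GWIN_CONFIG = {"iterations": 200_000, "learning_rate": 1e-4, "batch_size": 128}

# Using the 50,000-image training split from the Dataset Splits row,
# the BNN's training schedule works out to:
TRAIN_SET_SIZE = 50_000
steps_per_epoch = math.ceil(TRAIN_SET_SIZE / BNN_CONFIG["batch_size"])
total_bnn_steps = steps_per_epoch * BNN_CONFIG["epochs"]

print(steps_per_epoch)   # 391
print(total_bnn_steps)   # 11730
```

Under these assumptions the BNN sees roughly 391 batches per epoch, i.e. about 11,730 gradient steps over 30 epochs, compared with the GWIN's 200,000 iterations.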