Unconstrained Monotonic Neural Networks
Authors: Antoine Wehenkel, Gilles Louppe
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we evaluate the expressiveness of UMNN-MAF on a variety of density estimation benchmarks, as well as for approximate inference in variational auto-encoders. The source code to reproduce our experiments will be made available on Github at the end of the reviewing process. Experiments were carried out using the same integrand neural network in the UMNN component, i.e., in Equation 8, $f^i = f$ with shared weights $\psi^i = \psi$ for $i \in \{1, \ldots, d\}$. |
| Researcher Affiliation | Academia | Antoine Wehenkel (University of Liège); Gilles Louppe (University of Liège) |
| Pseudocode | Yes | We provide the pseudo-code of the forward and backward passes using Clenshaw-Curtis quadrature in Appendix B. |
| Open Source Code | No | The source code to reproduce our experiments will be made available on Github at the end of the reviewing process. |
| Open Datasets | Yes | We carry out experiments on tabular datasets (POWER, GAS, HEPMASS, MINIBOONE, BSDS300) as well as on MNIST. We follow the experimental protocol of Papamakarios et al. [2017]. |
| Dataset Splits | Yes | For each dataset, we report results on test data for our best performing model (selected on the validation data). |
| Hardware Specification | No | No specific hardware details (e.g., CPU/GPU models, memory) used for running the experiments are mentioned in the paper. |
| Software Dependencies | No | No specific software dependencies or their version numbers (e.g., Python, PyTorch, CUDA versions) are mentioned in the paper. |
| Experiment Setup | Yes | All training hyper-parameters and architectural details are given in Appendix A.1. |
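
For context on the Equation 8 excerpt quoted in the Research Type row: the UMNN transform integrates a strictly positive integrand network, so the Jacobian of the autoregressive map is triangular with positive diagonal entries and the log-density follows directly. The display below is reconstructed from the paper's definitions; the experiments quoted above use the shared-weight setting $f^i = f$, $\psi^i = \psi$.

$$
F^i\big(x_i; h^i(x_{1:i-1})\big) = \int_0^{x_i} f^i\big(t, h^i(x_{1:i-1}); \psi^i\big)\,\mathrm{d}t + \beta^i\big(h^i(x_{1:i-1})\big),
\qquad
\log p(\mathbf{x}) = \log p_z\big(F(\mathbf{x})\big) + \sum_{i=1}^{d} \log f^i\big(x_i, h^i; \psi^i\big).
$$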
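
The Pseudocode row refers to Appendix B of the paper for the forward and backward passes under Clenshaw-Curtis quadrature. Below is a minimal sketch of the forward pass only, not the authors' released implementation: the names `clenshaw_curtis` and `MonotonicTransform` and the `hidden`/`n_quad` defaults are illustrative assumptions, and gradients with respect to the integrand weights are left to autograd rather than the memory-efficient backward pass the paper describes.

```python
import numpy as np
import torch
import torch.nn as nn

def clenshaw_curtis(n):
    """Clenshaw-Curtis nodes and weights on [-1, 1] (n even, n + 1 points)."""
    k = np.arange(n + 1)
    nodes = np.cos(k * np.pi / n)
    j = np.arange(1, n // 2 + 1)
    b = np.where(j == n // 2, 1.0, 2.0)          # b_j = 2, except b_{n/2} = 1
    c = np.where((k == 0) | (k == n), 1.0, 2.0)  # c_k = 2, except endpoints
    # w_k = (c_k / n) * (1 - sum_j b_j / (4 j^2 - 1) * cos(2 j k pi / n))
    cosines = np.cos(2.0 * np.pi * np.outer(k, j) / n)
    weights = (c / n) * (1.0 - cosines @ (b / (4.0 * j**2 - 1.0)))
    return nodes, weights

class MonotonicTransform(nn.Module):
    """F(x) = integral_0^x f(t) dt + beta, with f > 0 enforced via ELU(.) + 1.
    A single integrand network is shared, mirroring psi^i = psi in the paper."""
    def __init__(self, hidden=64, n_quad=20):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(),
                               nn.Linear(hidden, 1))
        self.beta = nn.Parameter(torch.zeros(1))
        nodes, weights = clenshaw_curtis(n_quad)
        self.register_buffer("nodes", torch.as_tensor(nodes, dtype=torch.float32))
        self.register_buffer("weights", torch.as_tensor(weights, dtype=torch.float32))

    def forward(self, x):
        # Rescale nodes from [-1, 1] to [0, x]: t = x/2 * (s + 1), batch-wise.
        t = 0.5 * x * (self.nodes + 1.0)                      # (batch, n_quad + 1)
        ft = nn.functional.elu(self.f(t.reshape(-1, 1))) + 1.0
        ft = ft.reshape(x.shape[0], -1)
        return 0.5 * x * (ft * self.weights).sum(-1, keepdim=True) + self.beta
```

Differentiating through the weighted sum recovers $\partial F / \partial \psi$ as a quadrature of $\partial f / \partial \psi$ at the same nodes; the Appendix B backward pass computes the same quantity without retaining the forward graph, which is what makes deep UMNN stacks practical.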