Pathwise Derivatives Beyond the Reparameterization Trick
Authors: Martin Jankowiak, Fritz Obermeyer
ICML 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate with a variety of synthetic experiments and stochastic variational inference tasks that our pathwise gradients are competitive with other methods. |
| Researcher Affiliation | Industry | Uber AI Labs, San Francisco, USA. Correspondence to: <jankowiak@uber.com>, <fritzo@uber.com>. |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our approximations for pathwise gradients for the Gamma, Beta, and Dirichlet distributions are available in the 0.4 release of PyTorch (Paszke et al., 2017). (See the usage sketch after the table.) |
| Open Datasets | Yes | The dataset we consider is the Olivetti faces dataset, which consists of 64×64 grayscale images of human faces. http://www.cl.cam.ac.uk/research/dtg/attarchive/facedatabase.html |
| Dataset Splits | No | The paper describes datasets but does not provide specific information about training, validation, or test splits. No explicit mention of a validation set or its size is made. |
| Hardware Specification | No | The paper does not provide any specific hardware details such as CPU, GPU models, or memory specifications used for running its experiments. |
| Software Dependencies | Yes | Our approximations for pathwise gradients for the Gamma, Beta, and Dirichlet distributions are available in the 0.4 release of PyTorch (Paszke et al., 2017). |
| Experiment Setup | No | The paper provides some model-specific parameters but lacks concrete experimental setup details such as learning rates, batch sizes, number of epochs, or optimizer settings in the main text. |
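
Since the table notes that the authors' pathwise gradient approximations shipped in PyTorch 0.4, a minimal sketch of how they are exercised may be useful. The snippet below assumes any PyTorch release from 0.4 onward; the toy objective `(x - 1)^2` and the parameter values are illustrative, not taken from the paper. Calling `rsample()` on `Gamma` (and likewise `Beta` and `Dirichlet`) draws a reparameterized sample, so gradients of a downstream loss flow back to the distribution parameters through the sample itself.

```python
import torch
from torch.distributions import Gamma

# Illustrative toy objective: a single-sample Monte Carlo estimate of
# E_q[(x - 1)^2] under q = Gamma(alpha, beta). Values are arbitrary.
alpha = torch.tensor(2.0, requires_grad=True)
beta = torch.tensor(1.0, requires_grad=True)

dist = Gamma(alpha, beta)
x = dist.rsample()        # reparameterized (pathwise) sample
loss = (x - 1.0).pow(2)   # loss depends on the sample
loss.backward()           # pathwise gradients reach alpha and beta

print(alpha.grad, beta.grad)
```

The same pattern is what stochastic variational inference relies on: swapping `rsample()` for `sample()` would detach the draw from the autograd graph, leaving the distribution parameters with no gradient.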