Piecewise deterministic generative models
Authors: Andrea Bertazzi, Dario Shariatian, Umut Simsekli, Eric Moulines, Alain Durmus
NeurIPS 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Promising numerical simulations support further investigations into this class of models. In Section 4 we test our models on simple toy distributions. |
| Researcher Affiliation | Academia | (1) École Polytechnique, Institut Polytechnique de Paris; (2) INRIA, CNRS, École Normale Supérieure, PSL Research University; (3) MBZUAI |
| Pseudocode | Yes | Algorithm 1: Pseudo-code for the simulation of a homogeneous PDMP (a generic simulation sketch follows the table). |
| Open Source Code | Yes | we provide all the necessary codes to reproduce our experiments. |
| Open Datasets | Yes | We consider the task of generating handwritten digits by training the ZZP on the MNIST dataset. |
| Dataset Splits | No | The paper specifies training and test sample counts but does not explicitly mention a validation set or its split details. |
| Hardware Specification | Yes | We run our experiments on 50 Cascade Lake Intel Xeon 5218 CPUs (16 cores, 2.4 GHz). |
| Software Dependencies | No | The paper mentions software like PyTorch, zuko, and Adam by name but does not provide specific version numbers for these dependencies. |
| Experiment Setup | Yes | For each forward PDMP, we take a time horizon T_f equal to 5, and set the refreshment rate λ_r to 1. The optimiser is Adam [Kingma and Ba, 2015] with learning rate 5e-4 for all neural networks. We use a batch size of 4096 and train our model for 25000 steps. (A training-loop sketch with these settings follows below.) |
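
For context on the pseudocode row, the paper's Algorithm 1 simulates a homogeneous PDMP: deterministic flow between random event times, with jumps drawn from a Markov kernel. Below is a minimal thinning-based sketch of that scheme, not a transcription of the authors' algorithm. The names `simulate_pdmp`, `flow`, `rate`, and `jump` are hypothetical, and the dominating rate `lam_bar` is assumed to bound the true event rate along the trajectory.

```python
import numpy as np

def simulate_pdmp(flow, rate, jump, z0, T, lam_bar, seed=0):
    """Thinning-based simulation of a homogeneous PDMP (illustrative sketch).

    `flow(z, s)` applies the deterministic flow for time s, `rate(z)` is the
    event rate (assumed bounded above by lam_bar along the trajectory), and
    `jump(z, rng)` samples from the jump kernel at an event.
    """
    rng = np.random.default_rng(seed)
    t, z = 0.0, z0
    skeleton = [(t, z)]
    while True:
        tau = rng.exponential(1.0 / lam_bar)       # candidate inter-event time
        if t + tau >= T:
            skeleton.append((T, flow(z, T - t)))   # drift up to the time horizon
            return skeleton
        t, z = t + tau, flow(z, tau)               # deterministic motion between events
        if rng.uniform() < rate(z) / lam_bar:      # thinning: accept with prob rate/lam_bar
            z = jump(z, rng)
            skeleton.append((t, z))

# Example: a 1D Zig-Zag process with state z = (x, v), target exp(-x**2 / 2)
# (so grad U(x) = x) and refreshment rate 1, matching the paper's lambda_r = 1.
lam_ref = 1.0
flow = lambda z, s: (z[0] + s * z[1], z[1])
rate = lambda z: max(0.0, z[1] * z[0]) + lam_ref
def jump(z, rng):
    x, v = z
    # Split the event: velocity flip vs. refreshment, in proportion to the two rates.
    if rng.uniform() * rate(z) < max(0.0, v * x):
        return (x, -v)
    return (x, float(rng.choice([-1.0, 1.0])))

# lam_bar is chosen generously for this short run; it must dominate rate(z).
traj = simulate_pdmp(flow, rate, jump, z0=(0.0, 1.0), T=5.0, lam_bar=20.0)
```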
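
The experiment-setup row pins down the stated hyperparameters (Adam with learning rate 5e-4, batch size 4096, 25000 training steps). The PyTorch loop below is a hypothetical sketch wiring those values together; `model`, `pdmp_loss`, and the stand-in dataset are placeholders, since the paper's rate networks and training objective are not reproduced here.

```python
import torch

# Placeholder network; the paper trains neural networks for the backward rates.
model = torch.nn.Sequential(
    torch.nn.Linear(2, 128), torch.nn.SiLU(), torch.nn.Linear(128, 1)
)
# Stated optimiser and learning rate: Adam at 5e-4.
optimizer = torch.optim.Adam(model.parameters(), lr=5e-4)

def pdmp_loss(model, batch):
    # Placeholder objective standing in for the paper's training loss.
    return model(batch).pow(2).mean()

data = torch.randn(100_000, 2)   # stand-in dataset

# Stated schedule: batch size 4096, 25000 steps.
for step in range(25_000):
    idx = torch.randint(0, data.shape[0], (4096,))
    loss = pdmp_loss(model, data[idx])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```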