Automatic Posterior Transformation for Likelihood-Free Inference
Authors: David Greenberg, Marcel Nonnenmacher, Jakob Macke
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We compare APT to SNPE-A, SNPE-B and SNL on several problems (implementation details in A.5). |
| Researcher Affiliation | Academia | Computational Neuroengineering, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany. |
| Pseudocode | Yes | Algorithm 1: APT with per-round proposal updates (a hedged training-loop sketch follows the table). |
| Open Source Code | Yes | Code available at github.com/mackelab/delfi. |
| Open Datasets | No | The paper uses simulators (e.g., two-moons, SLCP, Lotka-Volterra) to generate data on the fly for its experiments, but does not provide access information (link, DOI, or formal citation) for any publicly available dataset; a hedged simulator sketch follows the table. |
| Dataset Splits | No | The paper describes generating simulations in rounds but does not specify exact training, validation, or test split percentages, sample counts, or citations to predefined splits for reproducibility. |
| Hardware Specification | No | The paper does not provide specific hardware details such as exact GPU or CPU models used for running the experiments. |
| Software Dependencies | No | The paper mentions 'implemented in PyTorch' but does not provide specific version numbers for PyTorch or other software dependencies. |
| Experiment Setup | Yes | All models were implemented in PyTorch (Paszke et al., 2017) and trained using Adam (Kingma & Ba, 2014) with an initial learning rate of 10⁻³ which was decayed by a factor of 10 if the validation loss did not decrease for 20 epochs (see the optimizer sketch below). |
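To give a concrete picture of what Algorithm 1 covers, here is a minimal PyTorch sketch of the per-round loop: draw parameters from the current proposal, simulate, retrain the conditional density estimator on all pairs accumulated so far, then use the posterior estimate at the observation as the next round's proposal. All names here (`simulator`, `CondGaussian`, `draw_proposal`) are hypothetical stand-ins rather than the delfi API, and the loss shown is plain maximum likelihood, not APT's atomic proposal-posterior loss.

```python
import torch

def simulator(theta):
    # Toy stand-in for the paper's simulators (two-moons, SLCP, Lotka-Volterra):
    # Gaussian observations centered on the parameters.
    return theta + 0.1 * torch.randn_like(theta)

class CondGaussian(torch.nn.Module):
    """Diagonal-Gaussian conditional density estimator q(theta | x), a minimal
    stand-in for the mixture-density networks / flows used in the paper."""
    def __init__(self, dim=2):
        super().__init__()
        self.dim = dim
        self.net = torch.nn.Sequential(
            torch.nn.Linear(dim, 50), torch.nn.ReLU(),
            torch.nn.Linear(50, 2 * dim))

    def dist(self, x):
        mean, log_std = self.net(x).split(self.dim, dim=-1)
        return torch.distributions.Normal(mean, log_std.exp())

    def log_prob(self, theta, x):
        return self.dist(x).log_prob(theta).sum(-1)

prior = torch.distributions.MultivariateNormal(torch.zeros(2), torch.eye(2))
x_o = torch.tensor([[0.5, -0.3]])             # hypothetical observed data point
q = CondGaussian()
opt = torch.optim.Adam(q.parameters(), lr=1e-3)

thetas, xs = [], []
draw_proposal = lambda n: prior.sample((n,))  # round 1: proposal = prior
for round_idx in range(3):
    theta = draw_proposal(1000)
    thetas.append(theta)
    xs.append(simulator(theta))
    all_theta, all_x = torch.cat(thetas), torch.cat(xs)
    for epoch in range(200):
        opt.zero_grad()
        # NOTE: APT proper maximizes the "atomic" proposal-posterior loss;
        # plain maximum likelihood is used here only to keep the sketch short.
        loss = -q.log_prob(all_theta, all_x).mean()
        loss.backward()
        opt.step()
    # The updated posterior estimate at x_o becomes the next round's proposal.
    draw_proposal = lambda n: q.dist(x_o.expand(n, -1)).sample()
```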
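The two-moons task illustrates why no fixed dataset exists: observations are produced by a stochastic simulator at training time. Below is a hedged NumPy sketch using the parameterization common in later SBI benchmarks; the exact constants are an assumption from memory and should be checked against the paper's appendix.

```python
import numpy as np

def two_moons(theta, rng):
    """Crescent-shaped two-moons simulator; constants follow the
    parameterization commonly used in SBI benchmarks (an assumption here)."""
    a = rng.uniform(-np.pi / 2, np.pi / 2)
    r = rng.normal(0.1, 0.01)
    p = np.array([r * np.cos(a) + 0.25, r * np.sin(a)])
    return p + np.array([-abs(theta[0] + theta[1]),
                         -theta[0] + theta[1]]) / np.sqrt(2)

# Training data are drawn on demand, so there is no fixed dataset to release:
rng = np.random.default_rng(0)
theta = rng.uniform(-1, 1, size=2)   # e.g. a uniform prior over parameters
x = two_moons(theta, rng)
```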
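The quoted training setup maps onto standard PyTorch components. In this minimal sketch the model and data are hypothetical placeholders (the paper does not state a PyTorch version); `ReduceLROnPlateau` with `factor=0.1` and `patience=20` reproduces the factor-of-10 decay after 20 epochs without validation-loss improvement.

```python
import torch

model = torch.nn.Linear(10, 2)   # hypothetical placeholder for the density estimator
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# Decay the learning rate by a factor of 10 when the validation loss
# has not decreased for 20 epochs, matching the quoted setup.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=20)

for epoch in range(100):
    x = torch.randn(64, 10)
    loss = model(x).pow(2).mean()      # dummy training objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    val_loss = loss.item()             # stand-in for a held-out validation loss
    scheduler.step(val_loss)
```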