Learning in Integer Latent Variable Models with Nested Automatic Differentiation
Authors: Daniel Sheldon, Kevin Winner, Debora Sujono
ICML 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conducted experiments on the accuracy, speed, and stability of AD inference algorithms for integer HMMs, and of parameter estimation using AD gradient algorithms. |
| Researcher Affiliation | Academia | 1College of Information and Computer Sciences, University of Massachusetts Amherst 2Department of Computer Science, Mount Holyoke College. |
| Pseudocode | Yes | Algorithm 1 Basic computation model... Algorithm 6 Lifted Nested Derivative L' |
| Open Source Code | Yes | We show the full 15-line Python implementation in the supplementary material. |
| Open Datasets | No | The paper states 'We generated data from integer HMMs...' and 'We generate data from an integer HMM...', indicating the data was simulated by the authors; no publicly available dataset or access information for the generated data is mentioned. |
| Dataset Splits | No | The paper uses simulated data and does not describe standard train/validation/test dataset splits. It generates data from integer HMMs for specific experiments. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for experiments, such as CPU or GPU models, or memory specifications. |
| Software Dependencies | No | The paper mentions a 'Python implementation' but does not specify the version of Python or any other software dependencies with their version numbers. |
| Experiment Setup | Yes | We generated data with immigration distribution m_k ~ Poisson(lambda) for increasing lambda, and fixed offspring distributions z_{k,i} ~ Bernoulli(0.5) or z_{k,i} ~ Poisson(0.5). |
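The data-generation setup quoted above (Poisson immigration, Bernoulli or Poisson offspring) can be sketched as a short simulation. This is a minimal illustrative sketch, not the authors' code: the chain length `K`, the binomial observation model with detection probability `detect_p`, and all parameter values are assumptions for the example; only the immigration and offspring distributions come from the paper.

```python
import numpy as np

def simulate_integer_hmm(K, lam, rho, detect_p, rng):
    """Simulate an integer HMM: population n_k is the sum of the previous
    population's offspring plus fresh immigrants m_k ~ Poisson(lam)."""
    n = 0
    counts, obs = [], []
    for _ in range(K):
        # Sum of n i.i.d. Bernoulli(rho) offspring is Binomial(n, rho).
        survivors = rng.binomial(n, rho)
        n = survivors + rng.poisson(lam)       # add Poisson immigration
        counts.append(n)
        # Assumed observation model: each individual detected independently.
        obs.append(rng.binomial(n, detect_p))
    return counts, obs

rng = np.random.default_rng(0)
counts, obs = simulate_integer_hmm(K=10, lam=5.0, rho=0.5, detect_p=0.6, rng=rng)
```

Swapping `rng.binomial(n, rho)` for `rng.poisson(0.5 * n)` gives the Poisson(0.5) offspring variant, since a sum of `n` i.i.d. Poisson(0.5) variables is Poisson(0.5·n).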