Learning interpretable continuous-time models of latent stochastic dynamical systems

Authors: Lea Duncker, Gergo Bohner, Julien Boussard, Maneesh Sahani

ICML 2019

Reproducibility Variable | Result | LLM Response

Research Type | Experimental | "We demonstrate our approach on simulated data from different nonlinear dynamical systems." (Abstract) and "In this section, we apply our algorithm to data generated from different nonlinear dynamical systems." (Section 5, Experiments)
Researcher Affiliation | Academia | "1 Gatsby Computational Neuroscience Unit, University College London, London, United Kingdom; 2 Stanford University, Palo Alto, California, USA."
Pseudocode | No | The paper describes its algorithms and equations but does not include structured pseudocode or algorithm blocks.
Open Source Code | No | The paper contains no statement about releasing source code and provides no link to a code repository.
Open Datasets | No | The paper uses simulated data from known dynamical systems but does not provide concrete access information (link, DOI, repository, or citation) for the specific simulated datasets used in its experiments.
Dataset Splits | No | The paper describes how data was simulated (e.g., "20 trials") but does not specify explicit training, validation, or test splits as percentages or counts for a defined dataset.
Hardware Specification | No | The paper does not report any specific hardware details, such as CPU/GPU models or memory, used to run its experiments.
Software Dependencies | No | The paper mentions using the forward Euler method for solving ODEs but does not name any software with version numbers for its implementation or other dependencies.
Experiment Setup | Yes | "In all experiments, we choose an exponentiated quadratic covariance function in the prior over the dynamics f and initialise the inducing point means and Jacobian matrices at zero. Each fixed point observation's uncertainty is initialised with a standard deviation of 0.1. We generate C and d by drawing their entries from Gaussian distributions unless otherwise stated, and initialise our algorithm at these parameter values. For inference, we solve the ODEs (19)-(22) using the forward Euler method with Δt = 1 ms. Unless stated otherwise, the link function is the identity g(z) = z."
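The quoted setup names two concrete numerical ingredients: an exponentiated quadratic covariance function for the prior over the dynamics f, and forward Euler integration of the inference ODEs with Δt = 1 ms. A minimal NumPy sketch of those two pieces is below; the example dynamics function and all hyperparameter values are illustrative placeholders, not taken from the paper.

```python
import numpy as np

def eq_kernel(x1, x2, variance=1.0, lengthscale=1.0):
    """Exponentiated quadratic (RBF) covariance between two sets of points.

    x1: (n, d) array, x2: (m, d) array. The hyperparameters here are
    illustrative defaults, not values reported in the paper.
    """
    sq_dists = np.sum((x1[:, None, :] - x2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def forward_euler(f, z0, dt=1e-3, n_steps=1000):
    """Forward Euler integration with step size dt (1 ms, as in the quoted
    setup). `f` stands in for whatever vector field is being integrated."""
    z = np.empty((n_steps + 1, len(z0)))
    z[0] = z0
    for k in range(n_steps):
        z[k + 1] = z[k] + dt * f(z[k])
    return z

# Illustrative dynamics: linear decay toward the origin (not from the paper).
traj = forward_euler(lambda z: -z, np.array([1.0, -0.5]), dt=1e-3, n_steps=1000)
```

Integrating dz/dt = -z for 1000 steps of 1 ms carries the state over one unit of time, so the final state is approximately e^{-1} times the initial one, which is a quick sanity check on the step size.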