Latent Ordinary Differential Equations for Irregularly-Sampled Time Series

Authors: Yulia Rubanova, Ricky T. Q. Chen, David K. Duvenaud

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We show experimentally that these ODE-based models outperform their RNN-based counterparts on irregularly-sampled data." Supporting sections: 4 Experiments; 4.1 Toy dataset; 4.2 Quantitative Evaluation; 4.3 MuJoCo Physics Simulation; 4.4 PhysioNet; 4.5 Human Activity dataset.
Researcher Affiliation | Academia | "Yulia Rubanova, Ricky T. Q. Chen, David Duvenaud. University of Toronto and the Vector Institute. {rubanova, rtqichen, duvenaud}@cs.toronto.edu"
Pseudocode | Yes | "Algorithm 1: The ODE-RNN." (an illustrative sketch of this algorithm follows the table)
Open Source Code | No | The paper does not contain any explicit statement about releasing source code or a link to a code repository.
Open Datasets | Yes | "We evaluated our model on the PhysioNet Challenge 2012 dataset [Silva et al., 2012]."
Dataset Splits | Yes | "On each dataset, we used 80% for training and 20% for test."
Hardware Specification | No | The paper states: "We thank the Vector Institute for providing computational resources." However, this does not give specific hardware details (e.g., CPU/GPU models, memory, cluster specifications).
Software Dependencies | No | The paper mentions using a GRU, but does not provide version numbers for any software dependencies (e.g., Python, PyTorch, TensorFlow, specific ODE solvers).
Experiment Setup | No | The paper refers to the supplementary material for details: "See supplement for more details on hyperparameters." The main text does not contain specific experimental setup details such as hyperparameter values or training configurations.
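
The Pseudocode row above points to Algorithm 1 (the ODE-RNN), in which the hidden state is evolved by a learned ODE between observation times and updated by a standard RNN cell (the paper mentions a GRU) at each observation. Below is a minimal, illustrative sketch of that recurrence, assuming PyTorch and the torchdiffeq odeint solver; the paper does not name its software stack, and names such as ODEFunc, ODERNN, and hidden_dim are hypothetical, not taken from the authors' code.

```python
# Hedged sketch of an ODE-RNN recurrence, assuming PyTorch + torchdiffeq.
import torch
import torch.nn as nn
from torchdiffeq import odeint


class ODEFunc(nn.Module):
    # Parameterizes the hidden-state dynamics dh/dt = f_theta(h) with a small MLP.
    def __init__(self, hidden_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim),
        )

    def forward(self, t, h):
        return self.net(h)


class ODERNN(nn.Module):
    # ODE-RNN: evolve the hidden state with an ODE solver between observation
    # times, then update it with a GRU cell at each observation.
    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.ode_func = ODEFunc(hidden_dim)
        self.gru_cell = nn.GRUCell(input_dim, hidden_dim)

    def forward(self, xs, ts):
        # xs: (num_obs, input_dim) observed values; ts: (num_obs,) increasing times.
        h = torch.zeros(1, self.hidden_dim)
        t_prev = ts[0]
        hidden_states = []
        for x, t in zip(xs, ts):
            if t > t_prev:
                # h'_i = ODESolve(f_theta, h_{i-1}, (t_{i-1}, t_i)): carry the
                # state across the irregular gap between timestamps.
                h = odeint(self.ode_func, h, torch.stack([t_prev, t]))[-1]
            # h_i = RNNCell(h'_i, x_i): discrete update at the observation.
            h = self.gru_cell(x.unsqueeze(0), h)
            hidden_states.append(h.squeeze(0))
            t_prev = t
        return torch.stack(hidden_states)  # (num_obs, hidden_dim)


# Example usage on an irregularly-sampled sequence of 4 observations.
model = ODERNN(input_dim=2, hidden_dim=16)
ts = torch.tensor([0.0, 0.3, 1.1, 1.4])   # irregular timestamps
xs = torch.randn(4, 2)                    # one observation per timestamp
hs = model(xs, ts)                        # (4, 16) hidden states
```

The design point captured here is the split of responsibilities: the GRU cell handles the discrete update at each observation, while the ODE solver carries the hidden state across the irregular gaps between timestamps, which is what distinguishes the ODE-RNN from a plain RNN on this kind of data.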