Attentive State-Space Modeling of Disease Progression

Authors: Ahmed M. Alaa, Mihaela van der Schaar

NeurIPS 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experiments on data from the UK Cystic Fibrosis registry show that our model demonstrates superior predictive accuracy, in addition to providing insights into disease progression dynamics."
Researcher Affiliation | Academia | Ahmed M. Alaa, ECE Department, UCLA, ahmedmalaa@ucla.edu; Mihaela van der Schaar, UCLA, University of Cambridge, and Alan Turing Institute, {mv472@cam.ac.uk, mihaela@ee.ucla.edu}
Pseudocode | No | The paper describes its methods in prose and with figures (e.g., Figure 2, Figure 3) but does not include any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | "We implemented our model using Tensorflow. The code is provided at https://bitbucket.org/mvdschaar/mlforhealthlabpub."
Open Datasets | Yes | "We used data from a cohort of patients enrolled in the UK CF registry, a database held by the UK CF trust." Footnote: https://www.cysticfibrosis.org.uk/the-work-we-do/uk-cf-registry/
Dataset Splits | Yes | "All prediction results reported in this Section were obtained via 5-fold cross-validation." (See the cross-validation sketch after this table.)
Hardware Specification | No | The paper does not provide specific details about the hardware used, such as GPU or CPU models. It only mentions software frameworks and optimization algorithms.
Software Dependencies | No | The paper states "We implemented our model using Tensorflow" but does not specify a version number for Tensorflow or other software dependencies.
Experiment Setup | Yes | "The LSTM cells in both the attention network (Figure 2) and the inference network (Figure 3) had 2 hidden layers of size 100. The model and inference networks were trained using ADAM with a learning rate of 5 × 10⁻⁴, and a mini-batch size of 100. The same hyperparameter setting was used for all baseline models involving RNNs." (See the training-configuration sketch after this table.)
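
The 5-fold cross-validation noted under Dataset Splits can be illustrated with a minimal sketch. The cohort size, model constructor, and evaluation metric below are hypothetical placeholders (the paper does not publish its splitting code), and splitting at the patient level is an assumption made here so that all visits of one patient stay in a single fold.

    # Hedged sketch of a 5-fold cross-validation protocol consistent with the quote above.
    # n_patients, build_attentive_state_space_model(), and the metric are placeholders.
    import numpy as np
    from sklearn.model_selection import KFold

    n_patients = 1000                          # placeholder cohort size
    patient_ids = np.arange(n_patients)

    kf = KFold(n_splits=5, shuffle=True, random_state=0)
    fold_scores = []
    for fold, (train_idx, test_idx) in enumerate(kf.split(patient_ids)):
        train_patients = patient_ids[train_idx]
        test_patients = patient_ids[test_idx]
        # model = build_attentive_state_space_model()   # hypothetical constructor
        # model.fit(train_patients); score = model.evaluate(test_patients)
        score = 0.0                            # placeholder metric
        fold_scores.append(score)
        print(f"fold {fold}: train={len(train_patients)}, test={len(test_patients)}")

    print("mean score over 5 folds:", np.mean(fold_scores))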
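The Experiment Setup row can be read as the following Keras configuration sketch: two stacked LSTM layers of size 100, ADAM with learning rate 5 × 10⁻⁴, and mini-batches of 100. The input/output dimensions and dummy data are placeholders, and this plain sequence model only stands in for the authors' attention and inference networks (Figures 2 and 3 of the paper), which are more elaborate.

    # Hedged sketch of the reported training configuration, not the authors' actual code.
    import numpy as np
    import tensorflow as tf

    n_timesteps, n_features, n_outputs = 20, 10, 3   # placeholder dimensions

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(n_timesteps, n_features)),
        tf.keras.layers.LSTM(100, return_sequences=True),   # hidden layer 1, size 100
        tf.keras.layers.LSTM(100, return_sequences=True),   # hidden layer 2, size 100
        tf.keras.layers.Dense(n_outputs, activation="softmax"),
    ])

    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=5e-4),  # ADAM, lr = 5 x 10^-4
        loss="categorical_crossentropy",
    )

    # Dummy training call illustrating the mini-batch size of 100.
    X = np.random.rand(500, n_timesteps, n_features).astype("float32")
    y = tf.keras.utils.to_categorical(
        np.random.randint(n_outputs, size=(500, n_timesteps)), n_outputs)
    model.fit(X, y, batch_size=100, epochs=1, verbose=0)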