ATTAIN: Attention-based Time-Aware LSTM Networks for Disease Progression Modeling
Authors: Yuan Zhang, Xi Yang, Julie Ivy, Min Chi
IJCAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We validate ATTAIN on modeling the progression of an extremely challenging disease, septic shock, by using real-world EHRs. Our results demonstrate that the proposed framework outperforms the state-of-the-art models such as RETAIN and T-LSTM. |
| Researcher Affiliation | Academia | Yuan Zhang1, Xi Yang1, Julie Ivy2 and Min Chi1; 1Computer Science, North Carolina State University; 2Industrial and System Engineering, North Carolina State University; 1{yzhang93, yxi2, mchi}@ncsu.edu, 2jsivy@ncsu.edu |
| Pseudocode | No | The paper provides mathematical equations and descriptions of the model architecture but does not include pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement or link regarding the availability of its source code. |
| Open Datasets | No | Our EHR data was collected from Christiana Care Health System (CCHS) from July, 2013 to December, 2015. |
| Dataset Splits | Yes | In the training process, we randomly divide the dataset into training, validation, and testing sets with a ratio of 70%, 15%, and 15%. |
| Hardware Specification | No | The paper does not specify any hardware details (e.g., GPU models, CPU types) used for running the experiments. |
| Software Dependencies | No | The paper does not specify any software dependencies with version numbers. |
| Experiment Setup | Yes | Training runs for 50 epochs with early stopping, the learning rate is 0.01, and the number of hidden units for the LSTM is 72. |
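Since the paper releases no code, the reported setup (a random 70%/15%/15% split and up to 50 training epochs with early stopping) can only be approximated. The following is a minimal sketch of those two pieces; every function name, the seed, and the patience value are assumptions not stated in the paper.

```python
import random


def split_dataset(records, seed=0):
    """Randomly split records into 70% train / 15% validation / 15% test,
    matching the ratios reported in the paper. The seed is an assumption."""
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(0.70 * n)
    n_val = int(0.15 * n)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]
    return train, val, test


def train_with_early_stopping(train_step, eval_loss, max_epochs=50, patience=5):
    """Generic early-stopping loop: run up to max_epochs (50 in the paper)
    and stop once validation loss fails to improve for `patience` epochs.
    The patience value is an assumption; the paper does not specify it."""
    best = float("inf")
    bad = 0
    for epoch in range(max_epochs):
        train_step(epoch)          # one epoch of optimization
        loss = eval_loss()         # validation loss after this epoch
        if loss < best - 1e-6:
            best, bad = loss, 0    # improvement: reset the counter
        else:
            bad += 1
            if bad >= patience:
                break              # no improvement for `patience` epochs
    return best
```

The actual model (an attention-based time-aware LSTM with 72 hidden units, trained at learning rate 0.01) would plug into `train_step` and `eval_loss`; those hyperparameters come from the table above, while the loop structure here is only a common pattern.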