Forecasting Potential Diabetes Complications
Authors: Yang Yang, Walter Luyten, Lu Liu, Marie-Francine Moens, Jie Tang, Juanzi Li
AAAI 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate the proposed model on a large collection of real medical records. Sparse FGM outperforms (+20% by F1) baselines significantly and gives detailed associations between diabetes complications and lab tests. In this section, we present experimental results to demonstrate the effectiveness of the proposed approach. |
| Researcher Affiliation | Academia | Yang Yang (Tsinghua University, yangyang@keg.cs.tsinghua.edu.cn); Walter Luyten (Katholieke Universiteit Leuven, Walter.Luyten@med.kuleuven.be); Lu Liu (Northwestern University, liulu26@gmail.com); Marie-Francine Moens (Katholieke Universiteit Leuven, sien.moens@cs.kuleuven.be); Jie Tang (Tsinghua University, jietang@tsinghua.edu.cn); Juanzi Li (Tsinghua University, ljz@keg.cs.tsinghua.edu.cn) |
| Pseudocode | Yes | Algorithm 1: Learning algorithm of Sparse FGM. (A hedged sketch of such a learning loop appears after this table.) |
| Open Source Code | Yes | All codes used in the paper are publicly available: http://arnetminer.org/diabetes |
| Open Datasets | No | We use a collection of real medical records from a famous geriatric hospital. The data set spans one year, containing 181,933 medical records corresponding to 35,525 unique patients and 1,945 kinds of lab tests in total. |
| Dataset Splits | No | In the experiments, we randomly picked 60% of the medical records as training data and the rest for testing. No validation set, random seed, or patient-level separation is reported. (A minimal sketch of the split appears after this table.) |
| Hardware Specification | Yes | All algorithms were implemented in C++, and all experiments were performed on a Mac running Mac OS X with Intel Core i7 2.66 GHz and 4 GB of memory. |
| Software Dependencies | No | All algorithms were implemented in C++, and LIBSVM (Chang and Lin 2011) is employed as the classification model for complication forecasting. However, no version numbers are given for the C++ toolchain or for LIBSVM. (A hedged stand-in for the LIBSVM step appears after this table.) |
| Experiment Setup | Yes | In all experiments, we empirically set the number of latent variables in Sparse FGM to 100, and set η = 0.1. We employ a gradient descent algorithm to learn the parameters in FGM (Tang, Zhuang, and Tang 2011), and set the learning rate parameter as 0.1. (These values are reused in the learning-loop sketch below.) |
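The Pseudocode and Experiment Setup rows name Algorithm 1 and its hyperparameters (100 latent variables, η = 0.1, learning rate 0.1) but reproduce no further detail. The sketch below is a minimal, hypothetical reading of such a learning loop, assuming the common proximal-gradient pattern for sparse models: a gradient step on the log-likelihood followed by L1 soft-thresholding, with η weighting the sparsity penalty. `grad_loglik` is a placeholder for the inference routine that would compute gradients over the factor graph; only the two reported rates come from the paper.

```python
import numpy as np

def soft_threshold(theta, t):
    """Proximal step for an L1 penalty: shrink each parameter toward zero by t."""
    return np.sign(theta) * np.maximum(np.abs(theta) - t, 0.0)

def learn_sparse_fgm(grad_loglik, n_params, n_iters=500,
                     learning_rate=0.1, eta=0.1):
    """Hypothetical learning loop for a sparse factor graph model.

    grad_loglik(theta) is assumed to return the gradient of the
    log-likelihood (in the paper this would require inference over
    the factor graph); eta weights the L1 sparsity penalty.
    """
    theta = np.zeros(n_params)
    for _ in range(n_iters):
        theta += learning_rate * grad_loglik(theta)          # likelihood step
        theta = soft_threshold(theta, learning_rate * eta)   # sparsity step
    return theta
```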
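The 60/40 record-level split itself is fully described, even though no seed or validation set is reported. A minimal sketch, assuming records are held in a Python list and treating the seed as an arbitrary choice:

```python
import numpy as np

def split_records(records, train_frac=0.6, seed=0):
    """Randomly assign 60% of medical records to training and the rest
    to testing, as described in the paper; the seed is an assumption."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(records))
    cut = int(train_frac * len(records))
    train = [records[i] for i in order[:cut]]
    test = [records[i] for i in order[cut:]]
    return train, test
```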
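Finally, the paper calls LIBSVM from C++; as a readable stand-in, the sketch below uses scikit-learn's `SVC`, which is built on LIBSVM. The feature representation and SVM settings (kernel, C) are assumptions, since the paper does not report them.

```python
from sklearn.svm import SVC  # scikit-learn's SVC wraps LIBSVM internally

def train_forecaster(X_train, y_train):
    """Binary classifier for one complication. X_train holds per-record
    features (in the paper, plausibly the latent factors inferred by
    Sparse FGM); y_train marks whether the complication occurred.
    Kernel and regularization are unreported, so defaults are kept."""
    return SVC().fit(X_train, y_train)

# Hypothetical usage:
# model = train_forecaster(X_train, y_train)
# y_pred = model.predict(X_test)
```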