Extending the Modelling Capacity of Gaussian Conditional Random Fields while Learning Faster

Authors: Jesse Glass, Mohamed Ghalwash, Milan Vukicevic, Zoran Obradovic

AAAI 2016 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Benefits of the proposed model in terms of improved accuracy and speed are characterized on several synthetic graphs with 2 million links as well as on a hospital admissions prediction task represented as a human disease-symptom similarity network corresponding to more than 35 million hospitalization records in California over 9 years.
Researcher Affiliation | Academia | Jesse Glass, Temple University, Philadelphia, USA, tud25892@temple.edu; Mohamed Ghalwash, Temple University, Philadelphia, USA, tuc30491@temple.edu; Milan Vukicevic, University of Belgrade, Belgrade, Serbia, vukicevicm@fon.bg.ac.rs; Zoran Obradovic, Temple University, Philadelphia, USA, zobrad@gmail.com
Pseudocode | No | The paper describes the mathematical formulation and optimization steps but does not include any clearly labeled pseudocode or algorithm blocks.
Open Source Code | No | That table is publicly available at http://astro.temple.edu/~tud25892.
Open Datasets | Yes | We evaluated UmGCRF on the problem of predicting monthly hospital admissions for 189 classes of diseases in California from HCUP data (HCUP 2011). HCUP. 2011. HCUP State Inpatient Databases (SID). Healthcare Cost and Utilization Project (HCUP). 2005-2009. Agency for Healthcare Research and Quality, Rockville, MD. http://www.hcup-us.ahrq.gov/sidoverview.jsp.
Dataset Splits | No | We train on the first 80 months and test on the remaining 27.
Hardware Specification | No | The following speed tests were done in Matlab with a single feature per target variable.
Software Dependencies | No | The following speed tests were done in Matlab with a single feature per target variable.
Experiment Setup | Yes | NN had 26 hidden nodes. The algorithm was tested 100 times because the NN is non-convex and yields different results each time.
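
The Dataset Splits row quotes a purely chronological split: the first 80 monthly snapshots are used for training and the remaining 27 for testing. The sketch below only illustrates that split; the total of 107 months follows from the quote, while the 189 columns, the variable names, and the synthetic Poisson counts are assumptions standing in for the HCUP admission data, not the authors' code.

```python
# Illustrative sketch of the chronological 80/27 split quoted above.
# Assumption: one row per month, one column per disease class; synthetic
# Poisson counts replace the HCUP admission counts, which are not shipped here.
import numpy as np

rng = np.random.default_rng(0)
n_months, n_diseases = 107, 189                    # 80 train + 27 test months
admissions = rng.poisson(lam=50.0, size=(n_months, n_diseases)).astype(float)

train, test = admissions[:80], admissions[80:]     # chronological, no shuffling
print(train.shape, test.shape)                     # (80, 189) (27, 189)
```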
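
The Experiment Setup row notes a neural-network baseline with 26 hidden nodes that was retrained 100 times because its objective is non-convex and each run yields a different result. The snippet below is a generic illustration of that repeat-and-aggregate protocol using a scikit-learn MLP on synthetic data; the features, targets, and error metric are assumptions, not the authors' Matlab implementation.

```python
# Generic illustration of the "tested 100 times" protocol quoted above: a
# 26-hidden-unit MLP is refit with 100 random initialisations and the spread
# of test errors is reported. Data and model choice are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(107, 5))                      # hypothetical monthly features
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=107)
X_tr, X_te, y_tr, y_te = X[:80], X[80:], y[:80], y[80:]

errors = []
for seed in range(100):                            # 100 restarts of the non-convex fit
    nn = MLPRegressor(hidden_layer_sizes=(26,), max_iter=500, random_state=seed)
    nn.fit(X_tr, y_tr)
    errors.append(mean_squared_error(y_te, nn.predict(X_te)))

print(f"test MSE over 100 runs: {np.mean(errors):.3f} +/- {np.std(errors):.3f}")
```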