Fast Sparse Gaussian Markov Random Fields Learning Based on Cholesky Factorization

Authors: Ivan Stojkovic, Vladisav Jelisavcic, Veljko Milutinovic, Zoran Obradovic

IJCAI 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | We evaluated the speed and solution quality of the newly proposed SCHL method on problems consisting of up to 24,840 variables. Our approach was several times faster than three state-of-the-art approaches. We also demonstrated that SCHL can be used to discover interpretable networks by applying it to a high-impact problem from the health informatics domain. |
| Researcher Affiliation | Academia | Center for Data Analytics and Biomedical Informatics, Temple University, Philadelphia, PA, USA; Mathematical Institute, Serbian Academy of Sciences and Arts, Belgrade, Serbia; School of Electrical Engineering, University of Belgrade, Belgrade, Serbia |
| Pseudocode | Yes | Algorithm 1: Coordinate Descent |
| Open Source Code | No | The paper mentions using code for other methods (QUIC and BCDIC) and re-implementing CSEPNL, but it does not provide an explicit statement or link to the source code for the proposed SCHL method. |
| Open Datasets | Yes | The dataset contains whole-blood gene expression profiles collected daily for up to 5 days from 53 subjects [Parnell et al., 2013]. |
| Dataset Splits | No | The paper describes the generation of synthetic data and the number of samples used for learning (1,000 samples), but it does not specify explicit train/validation/test splits, in percentages or counts, for model training or evaluation. |
| Hardware Specification | Yes | All experiments were conducted in a single thread on an Intel(R) Core(TM) i7-4770 CPU @ 3.40 GHz machine with 32 GB RAM. |
| Software Dependencies | No | The paper mentions using code for QUIC and BCDIC "provided by their respective authors" and re-implementing CSEPNL, but it does not specify version numbers for these or for any other software dependencies, such as programming languages or libraries. |
| Experiment Setup | Yes | Table 2: Comparison of quality indicators for SCHL and three alternative methods for learning the sparse precision matrix. All approaches produced solutions of comparable quality according to five metrics. SCHL ... Lambda 0.4 |
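The "Pseudocode" row above refers to the paper's Algorithm 1, a coordinate descent over the entries of the Cholesky factor of the precision matrix. As a rough illustration of the general idea (not the authors' SCHL implementation, whose code is not released), the following sketch minimizes a penalized Gaussian log-likelihood, tr(S L Lᵀ) − 2 Σᵢ log Lᵢᵢ + λ Σ_{i>j} |L_ij|, by sweeping soft-thresholded gradient steps over a lower-triangular L; the objective form, step size, and update order are assumptions for illustration only.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding operator, prox of t * |x|."""
    return np.sign(x) * max(abs(x) - t, 0.0)

def schl_objective(L, S, lam):
    """Penalized negative log-likelihood with Theta = L @ L.T."""
    return (np.trace(S @ L @ L.T)
            - 2.0 * np.sum(np.log(np.diag(L)))
            + lam * np.sum(np.abs(np.tril(L, -1))))

def cholesky_cd_sketch(S, lam=0.1, step=0.01, sweeps=300):
    """Hypothetical sketch of coordinate-wise descent on the Cholesky
    factor L of the precision matrix Theta = L @ L.T.

    Per sweep: gradient of the smooth part is 2*S@L for the trace term
    and -2/L_ii on the diagonal; off-diagonal entries get a proximal
    (soft-thresholded) step, diagonal entries a plain step clamped > 0.
    """
    p = S.shape[0]
    L = np.eye(p)  # identity start: Theta = I is positive definite
    for _ in range(sweeps):
        G = 2.0 * S @ L  # gradient of tr(S L L^T) w.r.t. L
        for i in range(p):
            g_diag = G[i, i] - 2.0 / L[i, i]
            L[i, i] = max(L[i, i] - step * g_diag, 1e-8)
            for j in range(i):
                L[i, j] = soft(L[i, j] - step * G[i, j], step * lam)
    return L

if __name__ == "__main__":
    # Tiny demo on a known sparse tridiagonal precision matrix.
    theta_true = np.array([[2.0, 0.5, 0.0],
                           [0.5, 2.0, 0.5],
                           [0.0, 0.5, 2.0]])
    S = np.linalg.inv(theta_true)  # population covariance
    L_hat = cholesky_cd_sketch(S)
    theta_hat = L_hat @ L_hat.T
    print(schl_objective(L_hat, S, 0.1))
```

Parameterizing Theta = L Lᵀ keeps every iterate positive definite by construction (each Lᵢᵢ is clamped above zero), which is the structural point of the Cholesky-based formulation; the stale-gradient sweep above trades exactness for brevity.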