Partially Linear Additive Gaussian Graphical Models

Authors: Sinong Geng, Minhao Yan, Mladen Kolar, Sanmi Koyejo

ICML 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Empirically, the PLA-GGM is applied to both synthetic and real-world datasets, demonstrating superior performance compared to competing methods. We apply the PLA-GGM to the 1000 Functional Connectomes Project Cobre dataset (COBRE, 2019), from the Center for Biomedical Research Excellence.
Researcher Affiliation | Academia | (1) Department of Computer Science, Princeton University, Princeton, New Jersey, USA; (2) Charles H. Dyson School of Applied Economics and Management, Ithaca, New York, USA; (3) Booth School of Business, University of Chicago, Chicago, Illinois, USA; (4) Department of Computer Science, University of Illinois at Urbana-Champaign, Champaign, Illinois, USA.
Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks.
Open Source Code | No | The paper does not provide an explicit statement about releasing source code or a link to a code repository for their method.
Open Datasets | Yes | We apply the PLA-GGM to the 1000 Functional Connectomes Project Cobre dataset (COBRE, 2019), from the Center for Biomedical Research Excellence. COBRE. The center for biomedical research excellence. http://fcon_1000.projects.nitrc.org/indi/retro/cobre.html, 2019. (Accessed on 01/14/2019).
Dataset Splits | No | The paper mentions '10-fold cross validation' for regularization parameter selection but does not provide explicit train/validation/test dataset splits (e.g., percentages or sample counts) for the main data used in experiments.
Hardware Specification | No | The paper does not specify any hardware used for running the experiments (e.g., CPU, GPU models, or cloud computing resources).
Software Dependencies | No | The paper mentions using 'the R package glmnet (Friedman et al., 2010)' but does not provide a specific version number for this software dependency.
Experiment Setup | Yes | The regularization parameter λ is selected by 10-fold cross validation from a series of automatically generated λ values by glmnet. We use 1{|g| ≥ g₀} = 1 − exp(−100g²/2) for the following analysis, which is equivalent to g₀ = 0.578. We use an L1-regularized logistic regression as this is a common approach in the literature.
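
As a rough illustration of the setup quoted in the last row (the authors released no code), the R sketch below shows how a regularization parameter can be selected by 10-fold cross-validation over glmnet's automatically generated λ sequence for an L1-regularized logistic regression. The matrix X and response y are synthetic placeholders rather than the paper's data, and lambda.min is one common selection rule, not necessarily the exact choice made in the paper.

    # Illustrative sketch only -- not the authors' code; X and y are synthetic placeholders.
    library(glmnet)

    set.seed(1)
    X <- matrix(rnorm(200 * 10), nrow = 200)   # 200 observations, 10 predictors
    y <- rbinom(200, size = 1, prob = 0.5)     # synthetic binary response

    # 10-fold cross-validation over glmnet's auto-generated lambda sequence;
    # alpha = 1 gives the L1 (lasso) penalty, family = "binomial" gives logistic regression.
    cv_fit <- cv.glmnet(X, y, family = "binomial", alpha = 1, nfolds = 10)

    lambda_hat <- cv_fit$lambda.min            # selected regularization parameter
    coef(cv_fit, s = "lambda.min")             # sparse coefficient estimates at that lambda

    # Recording the package version would address the missing dependency version noted above.
    packageVersion("glmnet")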