Multi-Layer Feature Reduction for Tree Structured Group Lasso via Hierarchical Projection

Authors: Jie Wang, Jieping Ye

NeurIPS 2015

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | Experiments on both synthetic and real data sets demonstrate that the speedup gained by MLFre can be orders of magnitude. |
| Researcher Affiliation | Academia | Jie Wang¹, Jieping Ye¹·²; ¹Computational Medicine and Bioinformatics, ²Department of Electrical Engineering and Computer Science, University of Michigan, Ann Arbor, MI 48109; {jwangumi, jpye}@umich.edu |
| Pseudocode | Yes | Algorithm 1 (Hierarchical Projection). |
| Open Source Code | No | The paper mentions using the 'SLEP package [15]' but does not explicitly state that code for the proposed MLFre method is open source, nor does it provide a link. |
| Open Datasets | Yes | We perform experiments on the Alzheimer's Disease Neuroimaging Initiative (ADNI) data set (http://adni.loni.usc.edu/). |
| Dataset Splits | No | The paper describes the construction of synthetic data and parameter tuning but does not give specific percentages or counts for training, validation, and test splits. |
| Hardware Specification | No | The paper does not specify the hardware used to run the experiments (e.g., CPU or GPU models, memory). |
| Software Dependencies | No | The paper mentions the 'SLEP package [15]' but does not provide version numbers for it or any other software dependency. |
| Experiment Setup | Yes | For each data set, the solver combined with MLFre is run along a sequence of 100 parameter values equally spaced on the logarithmic scale of λ/λmax from 1.0 to 0.05. The ADNI data set consists of 747 patients with 406,262 single nucleotide polymorphisms (SNPs). |
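The experiment setup row describes a regularization path over 100 values of λ/λmax, equally spaced on a logarithmic scale from 1.0 down to 0.05. A minimal sketch of generating that grid with NumPy (variable names are illustrative and not taken from the paper's code):

```python
import numpy as np

# 100 values of lambda/lambda_max, equally spaced on a log scale,
# decreasing from 1.0 to 0.05, as described in the experiment setup.
ratios = np.logspace(np.log10(1.0), np.log10(0.05), num=100)

# Each ratio would scale lambda_max to give the regularization
# parameter for one solver run along the path.
print(ratios[0], ratios[-1], len(ratios))
```

Sweeping the path from λmax downward is the usual convention, since warm-starting each solve from the previous solution is cheapest when λ shrinks gradually.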