Two-Layer Feature Reduction for Sparse-Group Lasso via Decomposition of Convex Sets
Authors: Jie Wang, Jieping Ye
NeurIPS 2014 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on both synthetic and real data sets show that TLFre improves the efficiency of SGL by orders of magnitude. |
| Researcher Affiliation | Academia | Jie Wang, Jieping Ye; Computer Science and Engineering, Arizona State University, Tempe, AZ 85287; {jie.wang.ustc, jieping.ye}@asu.edu |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper mentions using a third-party solver (SLEP [9]) but does not release source code for its own method (TLFre). |
| Open Datasets | Yes | We perform experiments on the Alzheimer's Disease Neuroimaging Initiative (ADNI) data set (http://adni.loni.usc.edu/). |
| Dataset Splits | No | The paper mentions 'cross validation' as an approach to determine parameter values but does not provide specific details on validation splits or cross-validation setup for its experiments. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper mentions 'SLEP [9]' as the solver used but does not provide specific version numbers for this or any other software dependency. |
| Experiment Setup | Yes | Given a data set, for illustrative purposes only, we select seven values of α from {tan(ψ) : ψ = 5°, 15°, 30°, 45°, 60°, 75°, 85°}. Then, for each value of α, we run TLFre along a sequence of 100 values of λ equally spaced on the logarithmic scale of λ/λ_max^α from 1 to 0.01. |
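
The "Experiment Setup" row is concrete enough to reconstruct the parameter grid. Below is a minimal sketch in Python/NumPy (an assumption; the paper itself relies on the MATLAB-based SLEP solver) of how the seven α values and the 100-point λ sequence could be generated. Here `lambda_max_alpha` is a hypothetical placeholder for the paper's λ_max^α, the smallest λ at which the SGL solution is identically zero for a given α; computing that value is not shown.

```python
import numpy as np

# Seven values of alpha = tan(psi) for psi in {5, 15, 30, 45, 60, 75, 85}
# degrees, as described in the Experiment Setup row.
psi_degrees = [5, 15, 30, 45, 60, 75, 85]
alphas = [float(np.tan(np.deg2rad(psi))) for psi in psi_degrees]

def lambda_sequence(lambda_max_alpha: float, num: int = 100,
                    hi: float = 1.0, lo: float = 0.01) -> np.ndarray:
    """100 values of lambda with lambda / lambda_max_alpha equally spaced
    on the logarithmic scale from 1 down to 0.01.

    lambda_max_alpha (the paper's lambda_max^alpha) is assumed to be
    supplied by the caller; this sketch does not compute it.
    """
    ratios = np.logspace(np.log10(hi), np.log10(lo), num=num)
    return lambda_max_alpha * ratios
```

For each α in `alphas`, the paper's procedure would then run TLFre over `lambda_sequence(lambda_max_alpha)` for the corresponding λ_max^α; the solver call itself is out of scope here, since the paper delegates it to SLEP.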