Direct Sparsity Optimization Based Feature Selection for Multi-Class Classification
Authors: Hanyang Peng, Yong Fan
IJCAI 2016 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The proposed algorithm has been evaluated based on publicly available datasets. The experiments have demonstrated that our algorithm could achieve feature selection performance competitive to state-of-the-art algorithms. |
| Researcher Affiliation | Academia | Hanyang Peng (1), Yong Fan (2). (1) National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, 100190, Beijing, P.R. China; (2) Department of Radiology, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, 19104, USA |
| Pseudocode | Yes | Algorithm 1. Feature Selection via Direct Sparsity Optimization (DSO-FS) |
| Open Source Code | No | No statement or link to open-source code for the paper's methodology is provided. |
| Open Datasets | Yes | The proposed algorithm has been evaluated based on 6 publicly available datasets. In particular, 2 datasets were obtained from UCI, including ISOLET and SEMEION. ... Another 2 datasets were microarray data, including LUNG and CLL-SUB-111. ... Our algorithm has also been validated based on 2 image datasets, including UMIST and AR. |
| Dataset Splits | Yes | In each trial, the samples of each dataset were randomly split into training and testing subsets with a ratio of 6:4. For tuning parameters, a 3-fold cross-validation was used for datasets with fewer than 200 training samples, and an 8-fold cross-validation was used for the other datasets. (A sketch of this protocol is given after the table.) |
| Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments are provided. |
| Software Dependencies | Yes | Therefore, it can be efficiently solved by existing tools, such as CVX (CVX Research, 2011). CVX Research, Inc. CVX: MATLAB software for disciplined convex programming, version 2.0, http://cvxr.com/cvx, 2011. (An illustrative use of such a solver is shown after the table.) |
| Experiment Setup | Yes | The parameter C of the linear SVM classifiers was tuned using a cross-validation strategy by searching the candidate set [10^-4, 10^-3, 10^-2, 10^-1, 1, 10^1, 10^2]. The regularization parameters of ℓ1-SVM and RFS were tuned using the same cross-validation strategy by searching the candidate set [10^-3, 10^-2, 10^-1, 1, 10^1, 10^2, 10^3]. In our experiments, we first normalized all the data to have zero mean and unit standard deviation for each feature. (See the protocol sketch after the table.) |
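
The split-and-tune protocol quoted in the "Dataset Splits" and "Experiment Setup" rows can be summarized in a short sketch. The following is a minimal Python/scikit-learn illustration, assuming a generic `(X, y)` dataset; the `evaluate` helper and the use of `LinearSVC`/`GridSearchCV` are our own stand-ins, since the paper releases no code.

```python
# Hypothetical sketch of the evaluation protocol quoted above: a 6:4 random
# split, per-feature standardization, and cross-validated tuning of the
# linear SVM parameter C over the candidate grid from the paper. The
# feature-selection step itself is omitted; only the protocol is illustrated.
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

def evaluate(X, y, seed=0):
    # Random 6:4 split into training and testing subsets.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, train_size=0.6, random_state=seed, stratify=y)

    # 3-fold CV for small training sets (< 200 samples), 8-fold otherwise.
    n_folds = 3 if len(y_tr) < 200 else 8

    # Normalize each feature to zero mean and unit standard deviation,
    # then tune C of a linear SVM over the quoted candidate set.
    pipe = Pipeline([("scale", StandardScaler()),
                     ("svm", LinearSVC(max_iter=10000))])
    grid = {"svm__C": [1e-4, 1e-3, 1e-2, 1e-1, 1, 1e1, 1e2]}
    search = GridSearchCV(pipe, grid, cv=n_folds)
    search.fit(X_tr, y_tr)
    return search.score(X_te, y_te)
```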
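
The only software dependency named is CVX, a MATLAB toolbox for disciplined convex programming. As a hedged illustration of how such a solver is used, the sketch below solves a generic ℓ1-regularized least-squares problem with CVXPY, a Python analogue; it is not the paper's DSO-FS objective, which is not reproduced here.

```python
# Illustrative disciplined-convex-programming example (a generic LASSO-style
# sparse regression), using CVXPY as a Python stand-in for the MATLAB CVX
# toolbox cited by the paper. This is NOT the paper's DSO-FS formulation.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))          # toy data: 100 samples, 50 features
w_true = np.zeros(50)
w_true[:5] = 1.0                            # only the first 5 features matter
y = X @ w_true + 0.1 * rng.standard_normal(100)

w = cp.Variable(50)
lam = 0.1                                   # sparsity-inducing regularization weight
problem = cp.Problem(cp.Minimize(cp.sum_squares(X @ w - y) + lam * cp.norm1(w)))
problem.solve()

selected = np.flatnonzero(np.abs(w.value) > 1e-4)   # indices of retained features
print("selected features:", selected)
```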