Robust Flexible Feature Selection via Exclusive L21 Regularization

Authors: Di Ming, Chris Ding

IJCAI 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments on twelve benchmark datasets demonstrate the effectiveness of the proposed regularization and the optimization algorithm as compared to state-of-the-arts.
Researcher Affiliation | Academia | Di Ming and Chris Ding, Department of Computer Science and Engineering, University of Texas at Arlington, USA; initialdiming@yahoo.com, chqding@uta.edu
Pseudocode | Yes | Algorithm 1 ("Search the largest coordinate τ of S") and Algorithm 2 ("ALM based optimization algorithm for solving the exclusive ℓ2,1 regularization in problem (14)"). (A generic ALM loop sketch is given below the table.)
Open Source Code | No | No explicit statement about releasing the source code for the described methodology, or a link to a code repository, was found.
Open Datasets | Yes | Experiments on twelve benchmark datasets are conducted to evaluate the performance of feature selection methods on classification. Among those benchmarks, there are 4 image datasets: MNIST [Lecun et al., 1998], Yale, Yale B, PIE [Sim et al., 2002]; 1 spoken letter recognition dataset: ISOLET; 5 bio-microarray datasets: Carcinomas [Yang et al., 2006], Lung [Bhattacharjee et al., 2001], Glioma [Nutt et al., 2003], TOX, Tumor-14 [Ramaswamy et al., 2001]; and 2 text datasets: CNAE-9 [Ciarelli and Oliveira, 2009], 20Newsgroups. Footnotes provide links: http://vision.ucsd.edu/content/yale-face-database, http://www.cad.zju.edu.cn/home/dengcai/Data/FaceData.html, http://featureselection.asu.edu/datasets.php, http://qwone.com/~jason/20Newsgroups/
Dataset Splits | Yes | To evaluate the performance on classification, the 5-fold cross-validation accuracy with SVM as the classifier is computed on average. (A cross-validation sketch is given below the table.)
Hardware Specification | No | No specific hardware details, such as GPU/CPU models, memory, or cloud instance types used for the experiments, were mentioned in the paper.
Software Dependencies | No | The paper mentions that "LIBSVM [Chang and Lin, 2011] is used as the practical implementation of SVM", but it does not provide a specific version number for LIBSVM or any other software dependency.
Experiment Setup | Yes | LIBSVM [Chang and Lin, 2011] is used as the practical implementation of SVM, where the kernel is set as linear and the parameter C is set to 1 for all experiments. For the proposed algorithm, parameters are initialized as: t = 0, ν^t = 1/‖X‖_F, ρ = 1.1, ε1 = 1e-8, ε2 = 1e-5, Λ^t = 0, with randomly initialized weights W^t. For the convergence study, α = 1 and β = 1. (An initialization sketch mapping these values onto the ALM loop is given below the table.)
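
The following is a minimal sketch, in Python/NumPy, of a generic ALM (augmented Lagrangian method) loop with variable splitting, meant only to illustrate the structure that Algorithm 2 follows. The subproblem solvers `update_W` and `update_V` are hypothetical placeholders rather than the paper's closed-form updates for problem (14), and the exclusive ℓ2,1 proximal step that Algorithm 1's coordinate search supports is not reproduced here.

```python
import numpy as np

def alm_solver(X, y, reg, update_W, update_V,
               rho=1.1, eps1=1e-8, eps2=1e-5, max_iter=1000):
    """Generic ALM loop for a split problem  min_W f(W) + reg * g(V)  s.t. W = V.

    `update_W` and `update_V` are placeholder subproblem solvers; the paper's
    Algorithm 2 uses its own updates derived for problem (14).
    """
    d = X.shape[1]
    c = y.shape[1] if y.ndim > 1 else 1
    W = np.random.randn(d, c)              # random initial weights, as in the paper's setup
    V = W.copy()
    Lam = np.zeros_like(W)                 # Lagrange multipliers Λ, initialized to 0
    nu = 1.0 / np.linalg.norm(X, 'fro')    # penalty ν initialized to 1 / ||X||_F

    for _ in range(max_iter):
        W_prev = W.copy()
        # primal updates (placeholders for the paper's closed-form solutions)
        W = update_W(X, y, V, Lam, nu)
        V = update_V(W, Lam, nu, reg)
        # dual ascent on the multipliers, then geometric penalty growth
        Lam = Lam + nu * (W - V)
        nu = rho * nu
        # stop when the split variables agree and the iterates have stabilized
        if (np.linalg.norm(W - V) <= eps1 and
                np.linalg.norm(W - W_prev) <= eps2):
            break
    return W
```

The dual update Λ ← Λ + ν(W - V) and the geometric penalty growth ν ← ρν mirror the parameters quoted in the Experiment Setup row; the exact stopping criteria used in the paper may differ.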
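
For reference, the initialization reported in the Experiment Setup row maps onto the sketch above as follows; the numeric values are the ones quoted from the paper, while the variable names, shapes, and data are illustrative only.

```python
import numpy as np

# toy shapes purely for illustration: n samples, d features, c classes
n, d, c = 100, 50, 10
X = np.random.randn(n, d)               # placeholder data matrix

# values reported in the paper's experiment setup
t    = 0                                # iteration counter
nu   = 1.0 / np.linalg.norm(X, 'fro')   # ν^t = 1 / ||X||_F
rho  = 1.1                              # penalty growth factor ρ
eps1 = 1e-8                             # stopping tolerance ε1
eps2 = 1e-5                             # stopping tolerance ε2
Lam  = np.zeros((d, c))                 # Λ^t = 0
W    = np.random.randn(d, c)            # randomly initialized weights W^t
alpha, beta = 1.0, 1.0                  # α = β = 1 in the convergence study
```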
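
The evaluation protocol (average 5-fold cross-validation accuracy with a linear SVM, C = 1) can be sketched as below. scikit-learn's `SVC`, which wraps LIBSVM, stands in here for the LIBSVM setup described in the paper, and `selected_idx` is a hypothetical placeholder for the output of whatever feature-selection method is being evaluated.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

def evaluate_selected_features(X, y, selected_idx, seed=0):
    """Average 5-fold CV accuracy of a linear SVM (C = 1) on the selected features."""
    clf = SVC(kernel='linear', C=1.0)    # linear kernel, C = 1, LIBSVM backend
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=seed)
    scores = cross_val_score(clf, X[:, selected_idx], y, cv=cv, scoring='accuracy')
    return scores.mean()

# usage with placeholder data and a placeholder top-50 feature subset
X = np.random.randn(300, 200)
y = np.random.randint(0, 3, size=300)
print(evaluate_selected_features(X, y, np.arange(50)))
```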