A Convex Formulation for Semi-Supervised Multi-Label Feature Selection

Authors: Xiaojun Chang, Feiping Nie, Yi Yang, Heng Huang

AAAI 2014

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this section, we conduct several experiments on large-scale datasets to validate the performance of our algorithm. First, we compare our algorithm with other feature selection algorithms, followed by a study of parameter sensitivity and of the convergence of Algorithm 1. The experimental results demonstrate that our proposed algorithm consistently achieves superior performance.
Researcher Affiliation | Academia | School of Information Technology & Electrical Engineering, The University of Queensland; Department of Computer Science and Engineering, University of Texas at Arlington.
Pseudocode | Yes | Algorithm 1: Optimization Algorithm for CSFS (a hedged optimization sketch appears after this table).
Open Source Code | No | The paper does not provide an explicit statement or a concrete link to open-source code for the described methodology.
Open Datasets | Yes | Five publicly available datasets are adopted in the experiments: MIML (Zhou and Zhang 2006), MIRFLICKR (Huiskes and Lew 2008), NUS-WIDE (Chua et al. 2009), YEAST (Elisseeff and Weston 2002), and SCENE (Boutell et al. 2004).
Dataset Splits | Yes | In the experiments, we randomly generate a training set for each dataset consisting of n samples, among which m% are labeled. Following the pipeline in (Ma et al. 2012a), we randomly split the training and testing data 5 times and report average results. LIBSVM (Chang and Lin 2011) with an RBF kernel is applied in the experiments. The optimal parameters of the SVM are determined by grid search with tenfold cross-validation (see the split-protocol sketch after this table).
Hardware Specification | No | The paper does not provide any specific hardware details such as GPU or CPU models, processor types, or memory used for running the experiments.
Software Dependencies | No | The paper mentions LIBSVM (Chang and Lin 2011) but does not specify a version number for it or for any other software dependency.
Experiment Setup | Yes | We tune all the parameters (if any) in the range {10^-6, 10^-4, 10^-2, 10^0, 10^2, 10^4, 10^6} for each algorithm and report the best results. ... The optimal parameters of the SVM are determined by grid search with tenfold cross-validation (see the tuning sketch after this table).
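The paper itself is the only source for Algorithm 1, and its semi-supervised label-estimation step is not reproduced here. As a rough illustration of the style of update such solvers use, the sketch below implements the standard iteratively reweighted scheme for an l2,1-regularized least-squares objective, min_W ||XW - Y||_F^2 + gamma ||W||_{2,1}; the function name, `gamma`, and the iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def l21_feature_selection(X, Y, gamma=1.0, n_iter=50, eps=1e-8):
    """Minimal sketch (assumed, not the paper's exact Algorithm 1):
    iteratively reweighted solver for
        min_W ||X W - Y||_F^2 + gamma * ||W||_{2,1}."""
    d = X.shape[1]
    D = np.eye(d)                                  # initial reweighting matrix
    for _ in range(n_iter):
        # Closed-form update with D fixed: W = (X^T X + gamma D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ Y)
        # D_ii = 1 / (2 ||w^i||_2), guarded against zero rows
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))
    # Rank features by the l2 norm of the corresponding rows of W
    scores = np.sqrt((W ** 2).sum(axis=1))
    return W, np.argsort(-scores)
```

Features with the largest row norms of W would then be kept; how many to keep is a separate choice not fixed by this sketch.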
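The split protocol quoted under Dataset Splits can be summarized in a short sketch. Only the structure follows the paper (a training set of n samples, m% of them labeled, five random repeats whose results are averaged); the function name, arguments, and seeding are hypothetical.

```python
import numpy as np

def make_splits(n_total, n_train, labeled_ratio, n_repeats=5, seed=0):
    """Sketch of the repeated-split protocol: draw a training set of
    n_train samples, mark labeled_ratio of them as labeled, use the
    remaining data for testing, and repeat n_repeats times so that
    results can be averaged."""
    rng = np.random.default_rng(seed)
    splits = []
    for _ in range(n_repeats):
        perm = rng.permutation(n_total)
        train_idx, test_idx = perm[:n_train], perm[n_train:]
        n_labeled = int(round(labeled_ratio * n_train))
        labeled_idx = train_idx[:n_labeled]      # the m% labeled training samples
        unlabeled_idx = train_idx[n_labeled:]    # training samples with labels hidden
        splits.append((labeled_idx, unlabeled_idx, test_idx))
    return splits
```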
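The quoted experiment setup amounts to a fixed regularization grid plus an RBF-kernel SVM tuned by grid search with tenfold cross-validation. The sketch below uses scikit-learn's `SVC` (a LIBSVM wrapper) as a stand-in for the LIBSVM setup in the paper; reusing the same grid for the SVM's C and gamma is an assumption, since the paper does not state the exact SVM search ranges, and the variable names are hypothetical.

```python
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Regularization values reported for tuning the feature-selection methods
reg_grid = [10.0 ** p for p in (-6, -4, -2, 0, 2, 4, 6)]

# RBF-kernel SVM tuned by grid search with tenfold cross-validation;
# the C/gamma grid below is assumed, not taken from the paper.
svm_search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": reg_grid, "gamma": reg_grid},
    cv=10,
)
# svm_search.fit(X_train_selected, y_train)   # hypothetical arrays; in the
# multi-label setting this would be wrapped per label (e.g., one-vs-rest)
```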