Multi-label Feature Selection via Global Relevance and Redundancy Optimization

Authors: Jia Zhang, Yidong Lin, Min Jiang, Shaozi Li, Yong Tang, Kay Chen Tan

IJCAI 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Empirical studies on twenty multi-label data sets reveal the effectiveness and efficiency of the proposed method.
Researcher Affiliation | Academia | Jia Zhang (1), Yidong Lin (2), Min Jiang (1), Shaozi Li (1), Yong Tang (3), and Kay Chen Tan (4). (1) Department of Artificial Intelligence, Xiamen University, China; (2) School of Mathematical Sciences, Xiamen University, China; (3) School of Computer Science, South China Normal University, China; (4) Department of Computer Science, City University of Hong Kong, Hong Kong. Emails: {j.zhang, linyidong}@stu.xmu.edu.cn, {minjiang, szlig}@xmu.edu.cn, ytang@m.scnu.edu.cn, kaytan@cityu.edu.hk
Pseudocode | No | The paper describes the proposed method using mathematical formulations and descriptive text, but it does not include a clearly labeled pseudocode or algorithm block.
Open Source Code | Yes | Our implementation of the proposed method is available online at: https://jiazhang-ml.pub/GRRO-master.zip.
Open Datasets | Yes | A total of twenty benchmark multi-label data sets are employed in the experiment. Table 1 summarizes detailed characteristics of these data sets, which are mainly from domains including text, multimedia, and biology. We use the same train/test splits of these data sets to report and compare the results. (Publicly available at http://www.uco.es/kdis/mllresources/)
Dataset Splits | No | The paper mentions 'train/test splits' and parameter tuning based on the 'average classification result (ACR) on test data', but it does not specify a distinct validation set or validation split for hyperparameter tuning (a sketch of carving such a split appears after the table).
Hardware Specification | Yes | Experiments are performed on a PC with an Intel i7-7700K 4.20GHz CPU and 32GB RAM.
Software Dependencies | No | The paper mentions using 'ML-KNN [Zhang and Zhou, 2007] (with the default setting)' and solving the matrix equation with 'the Lyapunov function in Matlab', but specific version numbers for Matlab or the ML-KNN implementation are not provided (a Python stand-in for the Lyapunov solve is sketched after the table).
Experiment Setup | Yes | For the proposed method, both α and β are searched in {10^-3, 10^-2, ..., 10^3}, and k is searched in {5, 10, ..., 50}. For parameter tuning, a grid-search strategy is adopted to seek the optimal setting, determined by making the average classification result (ACR) on the test data smallest (see the grid-search sketch after the table).
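
The Dataset Splits row notes that tuning is done directly on the test data, with no separate validation split. A reproduction wanting a leakage-free protocol could carve a validation set out of the published training split; the following minimal sketch uses scikit-learn's train_test_split on synthetic stand-in data (the variable names, data shapes, and the 80/20 ratio are illustrative assumptions, not part of the paper's protocol).

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Stand-in for a published training split: 1000 instances, 50 features,
# and a sparse binary label matrix with 10 labels (multi-label setting).
rng = np.random.default_rng(0)
X_train = rng.standard_normal((1000, 50))
Y_train = (rng.random((1000, 10)) < 0.1).astype(int)

# Hold out 20% of the training data for hyperparameter tuning, leaving
# the published test split untouched for the final evaluation.
X_tr, X_val, Y_tr, Y_val = train_test_split(
    X_train, Y_train, test_size=0.2, random_state=42
)
```

For multi-label data, an iterative-stratification splitter would preserve per-label proportions better than a plain random split, but a plain split suffices to illustrate the missing step.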
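The Software Dependencies row quotes the paper's use of 'the Lyapunov function in Matlab', i.e. Matlab's lyap(A, B, C), which returns X solving AX + XB + C = 0. In Python the same Sylvester-form equation can be solved with scipy.linalg.solve_sylvester, which solves AX + XB = Q, so C must be negated; the matrices below are random placeholders, not the paper's actual coefficient matrices.

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Matlab's lyap(A, B, C) returns X with A@X + X@B + C = 0.
# SciPy's solve_sylvester(A, B, Q) solves A@X + X@B = Q, hence Q = -C.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((3, 3))
C = rng.standard_normal((4, 3))

X = solve_sylvester(A, B, -C)
assert np.allclose(A @ X + X @ B + C, 0)  # matches the Matlab convention
```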
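The Experiment Setup row describes a grid search over α, β, and k that keeps the setting with the smallest ACR. A minimal sketch of that loop follows; grro_feature_selection (the proposed selector) and evaluate_acr (train ML-KNN on the selected features and average the classification metrics) are hypothetical names standing in for the released Matlab code, and X_train/Y_train/X_test/Y_test are assumed to hold the published splits.

```python
import itertools
import numpy as np

# Grids from the paper: alpha, beta in {10^-3, ..., 10^3}; k in {5, 10, ..., 50}.
alphas = [10.0 ** e for e in range(-3, 4)]
betas = [10.0 ** e for e in range(-3, 4)]
ks = list(range(5, 55, 5))

best_acr, best_params = np.inf, None
for alpha, beta, k in itertools.product(alphas, betas, ks):
    # Hypothetical: rank/select features with the proposed method.
    selected = grro_feature_selection(X_train, Y_train, alpha=alpha, beta=beta, k=k)
    # Hypothetical: train ML-KNN (default setting) on the selected features
    # and return the average classification result (ACR) on the test data.
    acr = evaluate_acr(X_train[:, selected], Y_train, X_test[:, selected], Y_test)
    if acr < best_acr:  # the paper keeps the setting with the smallest ACR
        best_acr, best_params = acr, (alpha, beta, k)
```

Note that scoring on the test data inside the tuning loop is exactly the protocol the Dataset Splits row flags; swapping X_test/Y_test for the validation split sketched above would make the comparison leakage-free.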