Learning Label-Specific Multiple Local Metrics for Multi-Label Classification

Authors: Jun-Xiang Mao, Jun-Yi Hang, Min-Ling Zhang

IJCAI 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental (4 experiments) | Comprehensive experiments on benchmark multi-label datasets validate the superiority of LSMM in learning effective similarity metrics for multi-label classification.
Researcher Affiliation | Academia | School of Computer Science and Engineering, Southeast University, Nanjing 210096, China; Key Laboratory of Computer Network and Information Integration (Southeast University), Ministry of Education, China; {maojx, hangjy, zhangml}@seu.edu.cn
Pseudocode | Yes | The complete procedure of LSMM can be found in Appendix A.
Open Source Code | No | No explicit statement or link providing access to the source code for the methodology described in this paper was found.
Open Datasets | Yes | Nine benchmark multi-label datasets with diversified properties are employed for comprehensive performance evaluation. Table 1 summarizes the characteristics of each experimental dataset D... Dataset sources: http://mulan.sourceforge.net/datasets.html, http://palm.seu.edu.cn/zhangml/Resources.htm#data, https://waikato.github.io/meka/datasets/
Dataset Splits | Yes | We take out 10% of the examples in each dataset as a hold-out validation set for hyperparameter searching and perform ten-fold cross-validation on the remaining 90% of the examples to evaluate the above approaches on the nine benchmark multi-label datasets.
Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, processor types, memory amounts, or detailed computer specifications) used for running the experiments were provided.
Software Dependencies | No | No specific software dependencies with version numbers (e.g., Python 3.8, PyTorch 1.9, CUDA 11.1) were provided.
Experiment Setup | Yes | In this paper, k_t and k_i are set to 20. γ and α are fixed to 2 and 0.4 respectively, and the smooth hinge loss is used to instantiate ℓ(·). In this paper, C is set to 3. For the proposed LSMM-SE and LSMM-CL approaches, regularization parameters λ1 and λ2 are searched in {10^-1, 1, ..., 10^3} and {10^-3, 10^-2, ..., 10} respectively. The number of nearest neighbors (denoted as k) in KNN and ML-KNN is set to 10.
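The dataset-split protocol quoted above (a 10% hold-out validation set plus ten-fold cross-validation on the remaining 90%) can be sketched as follows. This is a minimal illustration on toy data, assuming random sampling for the hold-out; the paper does not specify the sampling scheme, and the shapes and seeds here are arbitrary.

```python
# Sketch of the quoted evaluation protocol: 10% hold-out for hyperparameter
# search, then 10-fold cross-validation on the remaining 90%.
# Toy data only; shapes and seeds are illustrative, not from the paper.
import numpy as np
from sklearn.model_selection import KFold, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))         # toy feature matrix
Y = rng.integers(0, 2, size=(200, 5))  # toy multi-label targets (5 labels)

# Take out 10% of examples as a hold-out validation set.
X_rest, X_val, Y_rest, Y_val = train_test_split(
    X, Y, test_size=0.1, random_state=0)

# Ten-fold cross-validation on the remaining 90%.
fold_sizes = []
for train_idx, test_idx in KFold(n_splits=10, shuffle=True,
                                 random_state=0).split(X_rest):
    fold_sizes.append(len(test_idx))
    # ... here one would fit the model on X_rest[train_idx]
    # and evaluate it on X_rest[test_idx].

print(len(X_val), len(X_rest), sum(fold_sizes))  # 20 180 180
```

With 200 toy examples, the hold-out set has 20 examples and the ten folds together cover the remaining 180.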
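The quoted hyperparameter grids for λ1 and λ2 can be sketched as a simple grid search over the stated ranges. The scoring function below is a hypothetical placeholder standing in for training LSMM and evaluating on the hold-out validation set; only the grids themselves come from the paper.

```python
# Sketch of the quoted grid search: λ1 in {10^-1, 1, ..., 10^3},
# λ2 in {10^-3, 10^-2, ..., 10}. validation_score is a placeholder,
# not the LSMM objective.
import itertools

lambda1_grid = [10.0 ** p for p in range(-1, 4)]  # 0.1, 1, 10, 100, 1000
lambda2_grid = [10.0 ** p for p in range(-3, 2)]  # 0.001, ..., 10

def validation_score(l1, l2):
    # Hypothetical stand-in for: train LSMM with (l1, l2),
    # score on the hold-out validation set.
    return -(abs(l1 - 1.0) + abs(l2 - 0.1))

# Pick the grid point with the best validation score.
best = max(itertools.product(lambda1_grid, lambda2_grid),
           key=lambda pair: validation_score(*pair))
print(best)  # (1.0, 0.1)
```

Each grid has five values, so the search evaluates 25 (λ1, λ2) combinations in total.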