Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Tensor-based multi-view label enhancement for multi-label learning

Authors: Fangwen Zhang, Xiuyi Jia, Weiwei Li

IJCAI 2020 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive comparative studies validate that the performance of multi-view multi-label learning can be improved significantly with TMV-LE.
Researcher Affiliation | Academia | (1) Key Laboratory of Information Perception and Systems for Public Security of MIIT, Nanjing University of Science and Technology, China; (2) Jiangsu Key Lab of Image and Video Understanding for Social Security, Nanjing University of Science and Technology, China; (3) State Key Laboratory for Novel Software Technology, Nanjing University, China; (4) College of Astronautics, Nanjing University of Aeronautics and Astronautics, China. EMAIL, EMAIL
Pseudocode | No | The paper describes an optimization framework with alternating methods but does not provide a formal pseudocode or algorithm block.
Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository for the methodology described.
Open Datasets | Yes | Corel5k [Duygulu et al., 2002] and PASCAL VOC [Everingham et al., 2010] are two image recognition datasets.
Dataset Splits | Yes | For each dataset, ten-fold cross-validation is performed, and the mean results and standard deviations are recorded for all comparing algorithms.
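The ten-fold cross-validation protocol quoted above can be sketched as follows. This is a minimal illustration with synthetic placeholder data, not the authors' code; the dataset sizes, the metric, and all variable names are our assumptions.

```python
import numpy as np

# Hypothetical multi-label data (placeholders, not Corel5k / PASCAL VOC):
# 100 samples, 20 features, 5 binary labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
Y = rng.integers(0, 2, size=(100, 5))

# Shuffle indices once, then split them into ten disjoint folds.
n_folds = 10
indices = rng.permutation(len(X))
folds = np.array_split(indices, n_folds)

fold_scores = []
for i in range(n_folds):
    test_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(n_folds) if j != i])
    # Train on X[train_idx], Y[train_idx]; evaluate on the held-out fold.
    score = 0.0  # placeholder for a multi-label metric such as ranking loss
    fold_scores.append(score)

# Report mean and standard deviation over the ten folds, as in the paper.
mean_score = float(np.mean(fold_scores))
std_score = float(np.std(fold_scores))
```

Each sample appears in exactly one test fold, so the mean/std over folds summarizes performance on the whole dataset.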
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments.
Software Dependencies | No | The paper mentions several algorithms and frameworks (e.g., ML-kNN, F2L21F, ADMM, L-BFGS) but does not specify their version numbers for reproducibility.
Experiment Setup | Yes | For ML-kNN(B) and ML-kNN(C), the parameter k is set to 10. In F2L21F, the parameters λ1 and λ2 are both set to 10. For LSA-MML, the parameter r is chosen among {2, 3, 4, 5}, and the parameters α and β are chosen among {0.01, 0.1, 1, 10, 100}. In lrMMC, the parameter µ is determined as in MC-1, and the parameter γ is tuned over the set {10^i | i = −4, −3, ..., 3}. For TMV-LE, the parameters {λi | i = 1, 2, 3, 4}, α and β are chosen among {0.01, 0.1, 1, 10, 100}, the parameter k is set to 10, and the clustering method used is spectral clustering. For GLLE, the parameter λ is chosen among {0.01, 0.1, ..., 100}, and the number of neighbors K is set to c + 1, where c is the number of labels. The kernel function in GLLE is the Gaussian kernel.
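For readers re-running the comparison, the hyper-parameter grids quoted above can be transcribed explicitly. This is a sketch under stated assumptions: the dictionary layout and parameter names are ours (not from any released code), and the γ grid assumes the exponent range is −4 through 3, since the extracted text lost the minus signs.

```python
# Hyper-parameter grids transcribed from the reported experiment setup.
# All names here are our own labels, not identifiers from the paper's code.
param_grids = {
    "ML-kNN": {"k": [10]},
    "F2L21F": {"lambda1": [10], "lambda2": [10]},
    "LSA-MML": {
        "r": [2, 3, 4, 5],
        "alpha": [0.01, 0.1, 1, 10, 100],
        "beta": [0.01, 0.1, 1, 10, 100],
    },
    # lrMMC: gamma tuned over {10^i}; we assume i ranges over -4..3.
    "lrMMC": {"gamma": [10.0 ** i for i in range(-4, 4)]},
    "TMV-LE": {
        # lambda1..lambda4 share the same candidate set.
        **{f"lambda{i}": [0.01, 0.1, 1, 10, 100] for i in range(1, 5)},
        "alpha": [0.01, 0.1, 1, 10, 100],
        "beta": [0.01, 0.1, 1, 10, 100],
        "k": [10],
    },
    # GLLE also sets neighbors K = c + 1 (c = number of labels),
    # which depends on the dataset and so is not listed here.
    "GLLE": {"lambda": [0.01, 0.1, 1, 10, 100]},
}
```

Writing the grids out this way makes the search-space sizes easy to audit, e.g. TMV-LE searches six five-valued parameters plus a fixed k.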