Partial Label Learning with Unlabeled Data

Authors: Qian-Wei Wang, Yu-Feng Li, Zhi-Hua Zhou

IJCAI 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experiments on real-world data sets clearly validate the effectiveness of the proposed SSPL method." "Extensive experiments on real-world partial label data sets clearly show that SSPL achieves highly competitive performance against state-of-the-art approaches."
Researcher Affiliation | Academia | "Qian-Wei Wang, Yu-Feng Li and Zhi-Hua Zhou, National Key Laboratory for Novel Software Technology, Nanjing University"
Pseudocode | Yes | Algorithm 1: A simple solution; Algorithm 2: WGC (weighted graph construction) procedure; Algorithm 3: The proposed SSPL approach
Open Source Code | No | The paper does not provide any explicit statement about releasing source code, nor does it link to a code repository for the described methodology.
Open Datasets | Yes | "Three data sets are adopted for this task: Lost [Cour et al., 2011], Soccer Player [Zeng et al., 2013] and Yahoo! News [Guillaumin et al., 2010]." "The MSRCv2 [Liu and Dietterich, 2012] data set is adopted for this task." "The Bird Song [Briggs et al., 2012] data set is adopted for this task." "Table 4: Characteristics of the real-world partial label data sets."
Dataset Splits | Yes | "For each data set, we consider the percentage of partial label examples in the whole training set by randomly sampling p ∈ {0.05, 0.10, 0.15, 0.20, 0.30, 0.40, 0.50} instances from the whole training set with their candidate label sets, and the others with no labeling information." "In the rest of this section, five-fold cross-validation is performed on each real-world data set, and in each training fold the partial label instances are randomly sampled three times."
Hardware Specification | No | The paper does not mention the specific hardware (e.g., GPU/CPU models, memory, or cloud instance types) used to run the experiments.
Software Dependencies | No | The paper names comparison algorithms such as IPAL, PL-KNN, CLPL and PL-SVM but does not specify software dependencies with version numbers (e.g., Python 3.x or PyTorch 1.x).
Experiment Setup | Yes | "The parameters employed by SSPL are set as k = 10, α = 0.70, β = 0.25, r = 0.7 and T = 100."
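The dataset-split protocol reported above (five-fold cross-validation; within each training fold, a fraction p of instances keep their candidate label sets while the rest become unlabeled; sampling repeated three times) can be sketched as follows. This is a minimal illustration of the protocol, not the authors' code: the function names, the data set size, and the seeding scheme are all hypothetical.

```python
import random

def five_fold_indices(n, seed=0):
    """Shuffle indices 0..n-1 and split them into five roughly equal folds."""
    rng = random.Random(seed)
    idx = list(range(n))
    rng.shuffle(idx)
    return [idx[i::5] for i in range(5)]

def split_partial_unlabeled(train_idx, p, seed=0):
    """Mark a fraction p of the training fold as partial-label examples;
    the remaining instances carry no labeling information."""
    rng = random.Random(seed)
    idx = list(train_idx)
    rng.shuffle(idx)
    n_partial = round(p * len(idx))
    return idx[:n_partial], idx[n_partial:]  # (partial-label, unlabeled)

# Protocol loop; the values of p come from the paper's setup,
# the data set size n is made up for illustration.
n = 1000
folds = five_fold_indices(n)
for p in [0.05, 0.10, 0.15, 0.20, 0.30, 0.40, 0.50]:
    for k in range(5):
        test_idx = folds[k]
        train_idx = [i for j, f in enumerate(folds) if j != k for i in f]
        for repeat in range(3):  # partial-label instances sampled three times
            partial, unlabeled = split_partial_unlabeled(train_idx, p, seed=repeat)
            # ... train SSPL on (partial, unlabeled), evaluate on test_idx
```

Under this reading, each (data set, p) pair yields 5 × 3 = 15 runs whose results would be averaged.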