Multi-Class Learning using Unlabeled Samples: Theory and Algorithm
Authors: Jian Li, Yong Liu, Rong Yin, Weiping Wang
IJCAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Coinciding with the theoretical analysis, experimental results demonstrate that the stated approach achieves better performance. |
| Researcher Affiliation | Academia | ¹Institute of Information Engineering, Chinese Academy of Sciences; ²School of Cyber Security, University of Chinese Academy of Sciences. {lijian9026, liuyong, yinrong, wangweiping}@iie.ac.cn |
| Pseudocode | Yes | Algorithm 1 Proximal Stochastic Sub-gradient Singular Value Thresholding (PS3VT). A minimal sketch of the thresholding step is given after the table. |
| Open Source Code | No | The paper does not provide an explicit statement or link for the open-sourcing of its methodology's code. |
| Open Datasets | Yes | We run PS3VT and the compared methods on 15 multi-class datasets and report the results in Table 3. Labeled and unlabeled samples are given by stratified random sampling from the training data, with 30% used as labeled samples and the rest as unlabeled ones. |
| Dataset Splits | Yes | For fair comparison, before a method runs on any dataset, we employ 5-fold cross-validation to obtain the optimal parameter set by grid search over the candidate sets: complexity parameter $\tau_A \in \{10^{-15}, 10^{-14}, \dots, 10^{-6}\}$, unlabeled-samples parameter $\tau_I \in \{0, 10^{-15}, 10^{-14}, \dots, 10^{-6}\}$, local Rademacher complexity parameter $\tau_S \in \{0, 10^{-10}, 10^{-9}, \dots, 10^{-1}\}$, step size $1/\mu \in \{10^1, 10^2, \dots, 10^5\}$, and tail parameter $\theta \in \{0.5, 0.6, \dots, 0.9\} \cdot \min(|K|, |d|)$. |
| Hardware Specification | No | The paper does not provide specific details regarding the hardware used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific version numbers for any software dependencies used in the experiments. |
| Experiment Setup | Yes | For fair comparison, before a method runs on any dataset, we employ 5-fold cross-validation to obtain the optimal parameter set by grid search over the candidate sets: complexity parameter $\tau_A \in \{10^{-15}, 10^{-14}, \dots, 10^{-6}\}$, unlabeled-samples parameter $\tau_I \in \{0, 10^{-15}, 10^{-14}, \dots, 10^{-6}\}$, local Rademacher complexity parameter $\tau_S \in \{0, 10^{-10}, 10^{-9}, \dots, 10^{-1}\}$, step size $1/\mu \in \{10^1, 10^2, \dots, 10^5\}$, and tail parameter $\theta \in \{0.5, 0.6, \dots, 0.9\} \cdot \min(|K|, |d|)$. A minimal sketch of this grid-search protocol is given after the table. |
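
For context on the Pseudocode row: the core operation named in Algorithm 1, singular value thresholding, is the proximal operator of the nuclear (trace) norm. The sketch below is a hypothetical illustration of that thresholding step combined with a single stochastic sub-gradient update; the function names, the `subgrad` argument, and the step-size handling are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def svt(W, tau):
    """Singular value thresholding: the proximal operator of tau * nuclear norm.
    Soft-thresholds each singular value of W by tau."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def ps3vt_step(W, subgrad, step_size, tau_A):
    """One illustrative proximal stochastic sub-gradient step (not the paper's code):
    descend along a sub-gradient of the sampled loss, then apply SVT as the
    proximal map of the trace-norm regularizer weighted by tau_A."""
    return svt(W - step_size * subgrad, step_size * tau_A)
```

The thresholding follows the standard result that the proximal map of $\eta\tau\|\cdot\|_*$ shrinks each singular value of its argument by $\eta\tau$ and keeps the singular vectors unchanged.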
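
For the Experiment Setup row, the quoted model selection is a grid search validated by 5-fold cross-validation. Below is a minimal sketch of that protocol under stated assumptions: `train_model` and `accuracy` are hypothetical placeholders for the learner and its scorer, and the candidate grids simply transcribe the ranges quoted above (the tail parameter $\theta$ is omitted because it depends on the per-dataset values of $|K|$ and $|d|$).

```python
import itertools
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Candidate grids transcribed from the paper's quoted setup.
grid = {
    "tau_A": [10.0 ** p for p in range(-15, -5)],          # 1e-15 ... 1e-6
    "tau_I": [0.0] + [10.0 ** p for p in range(-15, -5)],  # 0, 1e-15 ... 1e-6
    "tau_S": [0.0] + [10.0 ** p for p in range(-10, 0)],   # 0, 1e-10 ... 1e-1
    "step":  [10.0 ** p for p in range(1, 6)],              # 1/mu in {1e1 ... 1e5}
}

def select_params(X, y, train_model, accuracy, n_splits=5, seed=0):
    """Return the parameter combination with the best mean 5-fold CV score."""
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    best_score, best_params = -np.inf, None
    keys = list(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        scores = []
        for tr, va in skf.split(X, y):
            model = train_model(X[tr], y[tr], **params)   # hypothetical trainer
            scores.append(accuracy(model, X[va], y[va]))  # hypothetical scorer
        if np.mean(scores) > best_score:
            best_score, best_params = np.mean(scores), params
    return best_params, best_score
```

Stratified folds keep class proportions roughly equal across splits, which matches the stratified sampling described in the Open Datasets row.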