Semi-Supervised Optimal Margin Distribution Machines
Authors: Teng Zhang, Zhi-Hua Zhou
IJCAI 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on twenty UCI data sets show that ssODM is significantly better than compared methods, which verifies the superiority of optimal margin distribution learning. |
| Researcher Affiliation | Academia | Teng Zhang and Zhi-Hua Zhou National Key Lab for Novel Software Technology, Nanjing University, Nanjing 210023, China |
| Pseudocode | Yes | Algorithm 1: Stochastic mirror prox for ssODM |
| Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | Yes | We empirically evaluate the proposed method on twenty UCI data sets. |
| Dataset Splits | Yes | For each UCI data set, 75% of the examples are randomly chosen for training, and the rest for testing. We investigate the performance of each approach with varying amounts of labeled data (namely, with 5% and 10% of the training data labeled). |
| Hardware Specification | No | The paper does not provide specific details about the hardware used to run the experiments. |
| Software Dependencies | No | The paper does not list specific software dependencies with version numbers required for reproducibility. |
| Experiment Setup | Yes | For all the methods, the parameters C, λ1, λ2 are selected from {1, 10, 100, 1000}. For ssODM, ν and θ are selected from {0.2, 0.4, 0.6, 0.8}. For all data sets, both the linear and Gaussian kernels are used. In particular, the width σ of the Gaussian kernel is picked from {0.25γ, 0.5γ, γ, 2γ, 4γ}, where γ is the average distance between instances. |
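The split and hyper-parameter grid described above can be sketched as follows. This is a minimal illustration, not the authors' code: the function names are invented for this sketch, and γ is assumed to be the average pairwise Euclidean distance, which the paper does not specify precisely.

```python
import numpy as np

def average_pairwise_distance(X):
    """Average Euclidean distance between all instance pairs.

    Assumed interpretation of the paper's gamma (the metric is not stated).
    """
    n = len(X)
    dists = [np.linalg.norm(X[i] - X[j])
             for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(dists))

def build_search_grid(gamma):
    """Hyper-parameter grid matching the Experiment Setup row."""
    return {
        "C": [1, 10, 100, 1000],           # also used for lambda1, lambda2
        "nu": [0.2, 0.4, 0.6, 0.8],        # ssODM margin-distribution params
        "theta": [0.2, 0.4, 0.6, 0.8],
        "sigma": [0.25 * gamma, 0.5 * gamma, gamma, 2 * gamma, 4 * gamma],
    }

def split_with_labels(n, rng, train_frac=0.75, labeled_frac=0.05):
    """75/25 train/test split; only a fraction of training examples keep labels."""
    idx = rng.permutation(n)
    n_train = int(round(train_frac * n))
    train, test = idx[:n_train], idx[n_train:]
    n_labeled = max(1, int(round(labeled_frac * n_train)))
    return train[:n_labeled], train[n_labeled:], test
```

For example, on a data set of 100 instances with `labeled_frac=0.05`, this yields 75 training examples (of which 4 are labeled) and 25 test examples, and the grid would then be scanned per kernel for each method.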