The Pessimistic Limits and Possibilities of Margin-based Losses in Semi-supervised Learning
Authors: Jesse Krijthe, Marco Loog
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The main conclusion from our analysis (Theorems 1 and 2) is that for classifiers defined by convex margin-based surrogate losses that are decreasing, it is impossible to come up with any semi-supervised approach that is able to guarantee safe improvement. |
| Researcher Affiliation | Academia | Jesse H. Krijthe (Radboud University, The Netherlands); Marco Loog (Delft University of Technology, The Netherlands; University of Copenhagen, Denmark) |
| Pseudocode | No | The paper contains mathematical formulations and proofs but no structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any concrete access information (e.g., a repository link) for source code related to its methodology. |
| Open Datasets | No | The paper is theoretical and does not use or name any specific datasets for training or empirical evaluation. |
| Dataset Splits | No | The paper is theoretical and does not describe any dataset splits (train/validation/test) for empirical evaluation. |
| Hardware Specification | No | The paper is theoretical and does not mention any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not specify any software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details such as hyperparameters or training configurations. |