β-risk: a New Surrogate Risk for Learning from Weakly Labeled Data
Authors: Valentina Zantedeschi, Rémi Emonet, Marc Sebban
NeurIPS 2016 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Finally, we report experiments in semi-supervised learning and learning with label noise, conducted on classical datasets from the UCI repository [15], in order to compare our algorithm with state-of-the-art approaches. |
| Researcher Affiliation | Academia | Univ Lyon, UJM-Saint-Etienne, CNRS, Institut d'Optique Graduate School, Laboratoire Hubert Curien UMR 5516, F-42023, SAINT-ETIENNE, France |
| Pseudocode | No | The paper describes the iterative algorithm in Section 3 in paragraph text but does not present it as structured pseudocode or an algorithm block. |
| Open Source Code | No | The paper provides a personal website link (http://vzantedeschi.com/) that does not directly lead to the source code for the methodology described. It also states implementation details: 'The iterative algorithm with β-SVM is implemented in Python using Cvxopt (for optimizing β-SVM) and Cvxpy with its Ecos solver [9].', but it does not explicitly provide access to the authors' own source code. |
| Open Datasets | Yes | conducted on classical datasets from the UCI repository [15] |
| Dataset Splits | Yes | For each proportion of labeled data, we perform a 4-fold cross-validation and we show the average accuracy over 10 iterations. (This protocol is sketched below the table.) |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | Yes | The iterative algorithm with β-SVM is implemented in Python using Cvxopt (for optimizing β-SVM) and Cvxpy with its Ecos solver [9]. (This solver stack is sketched below the table.) |
| Experiment Setup | Yes | Concerning the hyper-parameters of the different methods, we fix c2 of β-SVM to c1·(ml/m), c1 of WellSVM to 1 as explained in [14], and all the other hyper-parameters (c1 for β-SVM and c2 for WellSVM) are tuned by cross-validation through grid search. As for the stopping criteria, we fix ϵ of β-SVM to 10⁻⁵ + 10⁻³·‖h‖_F and ϵ of WellSVM to 10⁻³, and the maximal number of iterations to 20 for both methods. (The stopping rule is sketched below the table.) |
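
A minimal sketch of the reported evaluation protocol: 4-fold cross-validation, repeated 10 times and averaged, with the free hyper-parameter tuned by grid search. The authors' β-SVM is not released, so scikit-learn's `SVC` stands in purely to make the splitting and averaging logic runnable; the grid values, the iris data, and the inner 3-fold tuning split are assumptions, and the varying proportion of labeled data is not reproduced here.

```python
# Sketch of the evaluation loop: 4-fold CV, repeated 10 times, scores averaged.
# SVC is a stand-in for the unreleased beta-SVM; grid values are placeholders.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
grid = {"C": [0.01, 0.1, 1.0, 10.0]}              # assumed grid-search range

accuracies = []
for repeat in range(10):                           # "average accuracy over 10 iterations"
    outer = KFold(n_splits=4, shuffle=True, random_state=repeat)   # 4-fold CV
    model = GridSearchCV(SVC(kernel="linear"), grid, cv=3)         # tuning by grid search
    accuracies.extend(cross_val_score(model, X, y, cv=outer))

print("mean accuracy over 10 x 4 folds:", np.mean(accuracies))
```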
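
The reported software stack (Python, Cvxopt, Cvxpy with the Ecos solver) can be illustrated with a plain hinge-loss linear SVM solved through Cvxpy and ECOS. This is only a sketch of the solver plumbing: the objective below is a standard SVM, not the authors' β-SVM, and the toy data and the value of c1 are assumptions.

```python
# Illustrative Cvxpy + ECOS call on a standard hinge-loss linear SVM.
# The beta-SVM objective from the paper is NOT reproduced here.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 2))                        # toy features
y = np.sign(X[:, 0] + 0.5 * rng.standard_normal(40))    # toy labels in {-1, +1}

c1 = 1.0                                                # placeholder trade-off value
w, b = cp.Variable(2), cp.Variable()
margins = cp.multiply(y, X @ w + b)
objective = 0.5 * cp.sum_squares(w) + c1 * cp.sum(cp.pos(1 - margins))
problem = cp.Problem(cp.Minimize(objective))
problem.solve(solver=cp.ECOS)                           # the solver back-end named in the paper
print("w =", w.value, "b =", b.value)
```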
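
The quoted stopping criterion (ϵ = 10⁻⁵ + 10⁻³·‖h‖_F, at most 20 iterations) can be read as a tolerance that scales with the Frobenius norm of the current hypothesis h. The sketch below assumes h is stored as a matrix and that convergence is declared when the change in h falls below ϵ; `update_hypothesis` is a hypothetical placeholder for one step of the iterative algorithm.

```python
# Sketch of the quoted stopping rule: eps = 1e-5 + 1e-3 * ||h||_F, max 20 iterations.
# `update_hypothesis` is hypothetical; how the paper measures the change is assumed here.
import numpy as np

def run_until_converged(h0, update_hypothesis, max_iter=20):
    h = h0
    for _ in range(max_iter):
        h_new = update_hypothesis(h)
        eps = 1e-5 + 1e-3 * np.linalg.norm(h, "fro")      # tolerance scales with ||h||_F
        if np.linalg.norm(h_new - h, "fro") < eps:        # stop when the update is small
            return h_new
        h = h_new
    return h
```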