Online F-Measure Optimization
Authors: Róbert Busa-Fekete, Balázs Szörényi, Krzysztof Dembczynski, Eyke Hüllermeier
NeurIPS 2015 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Moreover, first experimental results are presented, showing that our method performs well in practice. |
| Researcher Affiliation | Academia | Róbert Busa-Fekete, Department of Computer Science, University of Paderborn, Germany; Balázs Szörényi, Technion, Haifa, Israel / MTA-SZTE Research Group on Artificial Intelligence, Hungary; Krzysztof Dembczyński, Institute of Computing Science, Poznań University of Technology, Poland; Eyke Hüllermeier, Department of Computer Science, University of Paderborn, Germany |
| Pseudocode | Yes | Algorithm 1 OFO (a hedged sketch of its update rule appears below the table) |
| Open Source Code | No | The paper does not contain an explicit statement or a direct link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | Yes | We used in the experiments nine datasets taken from the LibSVM repository of binary classification tasks.4 [Footnote 4: http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/binary.html] |
| Dataset Splits | Yes | We run OFO along with the three classifiers trained on 80% of the data. The rest 20% of the data was used to evaluate g_{τ_t} in terms of the F-measure. ... As a baseline, we applied the 2S approach. More concretely, we trained the same set of learners on 60% of the data and validated the threshold on 20% by optimizing (6). (This split protocol is sketched below the table.) |
| Hardware Specification | No | The paper does not provide specific hardware details such as exact CPU or GPU models, processor types, or memory amounts used for running the experiments. |
| Software Dependencies | No | The paper mentions software components like "Logistic Regression (LOGREG)", "Perceptron algorithm", and "PEGASOS" but does not specify their version numbers or the versions of any underlying programming languages or libraries. |
| Experiment Setup | Yes | The hyperparameters of the learning methods are chosen based on the performance of 2S. We tuned the hyperparameters in a wide range of values which we report in Appendix D. |
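The Pseudocode row refers to Algorithm 1 (OFO), which maintains a running threshold on posterior estimates while the classifier learns online. The Python sketch below reflects our reading of that update rule, not the authors' code; the `posterior_model` interface (`predict_proba`, `partial_fit`), the zero initialization of the threshold, and the function name are assumptions made for illustration.

```python
def ofo_threshold_stream(posterior_model, stream):
    """Hedged sketch of online F-measure thresholding in the style of OFO.

    Keeps running counts a (true positives) and b (predicted positives plus
    actual positives) and thresholds the posterior estimate at tau = a / b,
    which equals half of the online F-measure 2a / b.
    `posterior_model` is an assumed object exposing predict_proba(x) and
    partial_fit(x, y), e.g. an online logistic regression.
    """
    a, b = 0, 0
    tau = 0.0  # assumed initial threshold; forces a positive prediction at t = 1
    for x, y in stream:                      # stream yields (instance, 0/1 label) pairs
        p = posterior_model.predict_proba(x)  # estimate of P(y = 1 | x)
        y_hat = 1 if p >= tau else 0          # threshold at the current tau
        a += y * y_hat                        # running true positives
        b += y + y_hat                        # predicted positives + actual positives
        tau = a / b if b > 0 else 0.0         # new threshold = (online F-measure) / 2
        posterior_model.partial_fit(x, y)     # keep learning the posterior online
        yield y_hat, tau, (2 * a / b if b > 0 else 0.0)
```

In the paper's experiments the posterior estimates would come from one of the three learners listed above (LOGREG, Perceptron, PEGASOS); tracking a / b online is natural because it is exactly half of the online F-measure 2a / b.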
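The Dataset Splits row describes two protocols: 80%/20% for OFO and 60%/20%(/20%) for the two-stage (2S) baseline. A minimal scikit-learn sketch of producing such splits follows; the file path, random seeds, and the assumption that the remaining 20% of the 2S split serves as the test set are ours, since the excerpt does not specify them.

```python
from sklearn.datasets import load_svmlight_file
from sklearn.model_selection import train_test_split

# Hypothetical LibSVM-format file; the paper uses nine binary tasks from the
# LibSVM repository linked in the Open Datasets row.
X, y = load_svmlight_file("data/covtype.libsvm.binary")

# OFO protocol as quoted: 80% for online training, 20% held out to evaluate
# the thresholded classifier in terms of the F-measure.
X_train, X_eval, y_train, y_eval = train_test_split(X, y, test_size=0.2, random_state=0)

# 2S baseline as quoted: 60% to train the same learners, 20% to validate the
# threshold by optimizing the F-measure; we assume the last 20% is the test set.
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)
```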