Regret Bounds for Multilabel Classification in Sparse Label Regimes
Authors: Róbert Busa-Fekete, Heejin Choi, Krzysztof Dembczyński, Claudio Gentile, Henry Reeve, Balázs Szörényi
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We fill the gap in the landscape of theoretical results by providing upper and lower finite-sample regret bounds in MLC with a focus on computational efficiency. We consider two learning setups, a nonparametric and a parametric one. ... As a last contribution, we derive MLC regret lower bounds for our MLC setups revealing that, at least in the non-parametric case, our upper bound for Hamming loss is optimal up to a log s factor, and that our regret upper bound for Precision@κ is optimal up to a log m factor. ... No experimental results. |
| Researcher Affiliation | Collaboration | Róbert Busa-Fekete (Google Research, busarobi@google.com); Heejin Choi (Google, heejinc@google.com); Krzysztof Dembczyński (Yahoo Research and Poznan University of Technology, kdembczynski@cs.put.poznan.pl); Claudio Gentile (Google Research, cgentile@google.com); Henry W. Reeve (University of Bristol, henry.reeve@bristol.ac.uk); Balázs Szörényi (Yahoo Research, szorenyibalazs@gmail.com) |
| Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | No | Did you include the code, data, and instructions needed to reproduce the main experimental results (either in the supplemental material or as a URL)? [N/A] No experimental results. |
| Open Datasets | No | No experimental results. ... No data provided. |
| Dataset Splits | No | No experimental results. |
| Hardware Specification | No | No experimental results. |
| Software Dependencies | No | The paper is theoretical and does not report any experiments that would require specific software dependencies with version numbers. |
| Experiment Setup | No | No experimental results. |
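
For readers unfamiliar with the two losses named in the Research Type row, here is a minimal illustrative sketch (not taken from the paper, which provides no code) of how Hamming loss and Precision@k are commonly computed for multilabel predictions. The NumPy helper names and the toy arrays below are assumptions chosen purely for illustration.

```python
# Illustrative sketch only: common definitions of Hamming loss and Precision@k
# for multilabel classification; not code from the paper.
import numpy as np


def hamming_loss(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Fraction of label positions where prediction and ground truth disagree."""
    return float(np.mean(y_true != y_pred))


def precision_at_k(scores: np.ndarray, y_true: np.ndarray, k: int) -> float:
    """Fraction of the k highest-scoring labels per example that are relevant,
    averaged over examples."""
    top_k = np.argsort(-scores, axis=1)[:, :k]        # indices of the top-k labels per row
    hits = np.take_along_axis(y_true, top_k, axis=1)  # 1 where a top-k label is relevant
    return float(hits.mean())


# Toy data (assumed for illustration): 2 examples, 4 labels.
y_true = np.array([[1, 0, 1, 0],
                   [0, 1, 0, 0]])
y_pred = np.array([[1, 1, 1, 0],
                   [0, 0, 0, 0]])
scores = np.array([[0.9, 0.8, 0.7, 0.1],
                   [0.2, 0.6, 0.4, 0.3]])

print(hamming_loss(y_true, y_pred))         # 0.25 (2 mismatches out of 8 positions)
print(precision_at_k(scores, y_true, k=2))  # 0.5  (2 relevant labels among 4 top-2 picks)
```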