Multiclass Transductive Online Learning

Authors: Steve Hanneke, Vinod Raman, Amirreza Shaeiri, Unique Subedi

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this paper, our main contribution is algorithmically answering the following question in the multiclass transductive online learning framework: given a concept class $\mathcal{C} \subseteq \mathcal{Y}^{\mathcal{X}}$, what is the minimum expected number of mistakes achievable by a learner against any realizable adversary? To prove this result, we introduce another combinatorial dimension, termed the Level-constrained Branching dimension, and show that its finiteness characterizes constant minimax expected mistake bounds.
Researcher Affiliation | Academia | Steve Hanneke, Department of Computer Science, Purdue University, West Lafayette, IN 47907 (steve.hanneke@gmail.com); Vinod Raman, Department of Statistics, University of Michigan, Ann Arbor, MI 48104 (vkraman@umich.edu); Amirreza Shaeiri, Department of Computer Science, Purdue University, West Lafayette, IN 47907 (amirreza.shaeiri@gmail.com); Unique Subedi, Department of Statistics, University of Michigan, Ann Arbor, MI 48104 (subedi@umich.edu)
Pseudocode | No | The paper describes algorithms in paragraph form, such as 'For every $t \in \{1, \dots, T\}$, if we have $\{c(x_t) : c \in V_t\} = \{y\}$, then predict $\hat{y}_t = y$. Otherwise, define $V_t^y = \{c \in V_t : c(x_t) = y\}$ for all $y \in \mathcal{Y}$, and predict $\hat{y}_t = \arg\max_{y \in \mathcal{Y}} B(V_t^y, x_{t+1:T})$.', but it does not include explicitly labeled 'Algorithm' or 'Pseudocode' blocks. (An illustrative sketch of this prediction rule appears after the table.)
Open Source Code | No | The paper does not contain any explicit statements about releasing source code for the methodology, nor does it provide any links to a code repository.
Open Datasets | No | This is a theoretical paper that does not involve empirical studies or the use of datasets for training, validation, or testing.
Dataset Splits | No | This is a theoretical paper without empirical studies, so no dataset splits are discussed.
Hardware Specification | No | This paper is purely theoretical and does not describe any experiments, so no hardware specifications are mentioned.
Software Dependencies | No | This is a theoretical paper without experimental implementation details, so no software dependencies with version numbers are mentioned.
Experiment Setup | No | This paper is theoretical and does not describe any experimental setup, hyperparameters, or training configurations.
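
As a reading aid for the Pseudocode row above, the following is a minimal Python sketch of the quoted prediction rule. It is an illustrative reconstruction under stated assumptions, not the authors' code: the version space is assumed to be a finite list of concepts (callables from instances to labels), and `B` is a caller-supplied scoring function standing in for the branching quantity used in the paper, which is not defined in this excerpt. A toy driver shows how the rule would be played against a realizable adversary in the transductive setting, where the full instance sequence is known to the learner up front.

```python
# Illustrative sketch (not the authors' implementation) of the prediction rule
# quoted in the Pseudocode row. Assumptions: the version space V_t is a finite
# list of concepts (callables x -> label), and B(V, remaining_xs) is some
# caller-supplied scoring function (a placeholder for the paper's branching
# quantity, which this excerpt does not define).

def predict(version_space, t, xs, B):
    """Predict a label for xs[t] given the current version space.

    version_space: list of concepts c, each a callable c(x) -> label
    t:             current round (0-indexed into xs)
    xs:            full instance sequence x_1, ..., x_T, known in advance
    B:             assumed scoring function B(V, remaining_xs) -> number
    """
    x_t = xs[t]
    labels = {c(x_t) for c in version_space}
    # If every surviving concept agrees on x_t, predict that common label.
    if len(labels) == 1:
        return labels.pop()
    # Otherwise, split the version space by the label assigned to x_t and
    # predict the label whose sub-version-space maximizes B on the
    # remaining instances x_{t+1}, ..., x_T.
    remaining = xs[t + 1:]
    return max(
        labels,
        key=lambda y: B([c for c in version_space if c(x_t) == y], remaining),
    )


def run_realizable_game(concept_class, xs, true_concept, B):
    """Toy driver: count mistakes against a realizable adversary."""
    version_space = list(concept_class)
    mistakes = 0
    for t, x_t in enumerate(xs):
        y_hat = predict(version_space, t, xs, B)
        y_t = true_concept(x_t)  # the adversary reveals the true label
        mistakes += int(y_hat != y_t)
        # Keep only the concepts consistent with the revealed label.
        version_space = [c for c in version_space if c(x_t) == y_t]
    return mistakes
```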