A Competitive Algorithm for Agnostic Active Learning

Authors: Yihan Zhou, Eric Price

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | "We take a different approach to agnostic active learning, getting an algorithm that is competitive with the optimal algorithm for any binary hypothesis class H and distribution D_X over X. In particular, if any algorithm can use m queries to get O(η) error, then our algorithm uses O(m log |H|) queries to get O(η) error. Our main result is just such a competitive bound."
Researcher Affiliation | Academia | Yihan Zhou (joeyzhou@cs.utexas.edu) and Eric Price (ecprice@cs.utexas.edu), Department of Computer Science, University of Texas at Austin.
Pseudocode | Yes | Algorithm 1: Competitive Algorithm for Active Agnostic Learning.
Open Source Code | No | No explicit statement about open-sourcing code, and no link to a code repository, is provided.
Open Datasets | No | The paper is theoretical and describes no experiments on specific datasets, so no public-dataset information is given.
Dataset Splits | No | The paper is theoretical, so no training/validation/test splits are described.
Hardware Specification | No | The paper is theoretical and reports no hardware used for experiments.
Software Dependencies | No | The paper is theoretical and lists no software dependencies or version numbers.
Experiment Setup | No | The paper is theoretical and provides no experimental setup, hyperparameters, or training settings.
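Since the paper provides only pseudocode (Algorithm 1), there is no official implementation to reproduce. As a purely hypothetical illustration of the query-based setting the table refers to, assuming a finite hypothesis class and a label oracle, a toy disagreement-based active learner might look like the sketch below. This is not the paper's Algorithm 1 and carries none of its competitive guarantee; the function name and setup are invented for illustration.

```python
def disagreement_query_learning(H, X, oracle, budget):
    """Toy pool-based active learner over a finite hypothesis class.

    H: list of hypotheses, each a callable x -> {0, 1}
    X: list of unlabeled pool points
    oracle: callable x -> {0, 1}, possibly noisy (agnostic setting)
    budget: maximum number of label queries

    Hypothetical illustration only; NOT the paper's Algorithm 1.
    """
    def disagreement(x):
        # How evenly the class splits on x; high values mark informative queries.
        votes = sum(h(x) for h in H)
        return min(votes, len(H) - votes)

    labeled = {}
    for _ in range(budget):
        candidates = [x for x in X if x not in labeled]
        if not candidates:
            break
        x = max(candidates, key=disagreement)
        if disagreement(x) == 0:
            break  # every hypothesis agrees on all remaining points
        labeled[x] = oracle(x)

    # Agnostic setting: return the empirical-error minimizer on queried labels.
    return min(H, key=lambda h: sum(h(x) != y for x, y in labeled.items()))


# Usage: threshold classifiers on {0, ..., 9}; the oracle's true threshold is 5.
H = [(lambda t: (lambda x: int(x >= t)))(t) for t in range(11)]
oracle = lambda x: int(x >= 5)
learned = disagreement_query_learning(H, list(range(10)), oracle, budget=4)
```

With a budget of 4 this sketch spends its labels near the decision boundary instead of labeling the whole pool; the O(m log |H|) query bound in the paper comes from a far more careful query strategy than this greedy heuristic.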