Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
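The validation step described above — comparing automated LLM labels to a manually labeled reference set — can be sketched as follows. This is a minimal, hypothetical illustration; the label values and helper names are assumptions, not the actual pipeline from [1].

```python
# Hypothetical sketch of validating LLM-assigned labels against manual labels.
# The example data below is illustrative only, not from the pipeline in [1].
from collections import Counter

# Gold (manual) vs. predicted (LLM) labels for one reproducibility variable,
# e.g. "Open Source Code": each entry is "Yes" or "No" for one paper.
manual = ["Yes", "No", "No", "Yes", "No", "No"]
llm    = ["Yes", "No", "Yes", "Yes", "No", "No"]

def accuracy(gold, pred):
    """Fraction of items where the LLM label matches the manual label."""
    assert len(gold) == len(pred)
    return sum(g == p for g, p in zip(gold, pred)) / len(gold)

def confusion(gold, pred):
    """Counts of (manual_label, llm_label) pairs, exposing error directions."""
    return Counter(zip(gold, pred))

print(f"accuracy: {accuracy(manual, llm):.2f}")
print(confusion(manual, llm))
```

A confusion count like `("No", "Yes")` would indicate the LLM over-reporting a reproducibility variable relative to the manual annotation, which is the kind of bias the notice cautions about.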

Efficient Active Learning with Abstention

Authors: Yinglun Zhu, Robert Nowak

NeurIPS 2022 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | Our paper is theoretical in nature, and there is no negative societal impact of our work in the foreseeable future.
Researcher Affiliation | Academia | Yinglun Zhu, Department of Computer Sciences, University of Wisconsin-Madison, Madison, WI 53706, EMAIL; Robert Nowak, Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI 53706, EMAIL
Pseudocode | Yes | Algorithm 1: Efficient Active Learning with Abstention
Open Source Code | No | The paper is theoretical and does not mention releasing any source code or provide links to a repository.
Open Datasets | No | The paper is theoretical and does not conduct experiments on datasets, so there is no mention of public dataset availability.
Dataset Splits | No | The paper is theoretical and does not conduct experiments with dataset splits for training, validation, or testing.
Hardware Specification | No | The paper is theoretical and does not describe any experimental setup or the hardware used to run experiments.
Software Dependencies | No | The paper is theoretical and does not detail any software dependencies with specific version numbers for experimental reproducibility.
Experiment Setup | No | The paper is theoretical and focuses on algorithm design and proofs, not experimental setup details such as hyperparameters or training configurations.