Efficient Active Learning with Abstention
Authors: Yinglun Zhu, Robert Nowak
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our paper is theoretical in nature, and there is no negative societal impact of our work in the foreseeable future. |
| Researcher Affiliation | Academia | Yinglun Zhu, Department of Computer Sciences, University of Wisconsin-Madison, Madison, WI 53706, yinglun@cs.wisc.edu; Robert Nowak, Department of Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI 53706, rdnowak@wisc.edu |
| Pseudocode | Yes | Algorithm 1 Efficient Active Learning with Abstention |
| Open Source Code | No | The paper is theoretical and does not mention releasing any source code or provide links to a repository. |
| Open Datasets | No | The paper is theoretical and does not conduct experiments on datasets, so there is no mention of public dataset availability. |
| Dataset Splits | No | The paper is theoretical and does not conduct experiments with dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not describe any experimental setup or the hardware used to run experiments. |
| Software Dependencies | No | The paper is theoretical and does not list any software dependencies or version numbers relevant to experimental reproducibility. |
| Experiment Setup | No | The paper is theoretical and focuses on algorithm design and proofs; it does not describe experimental setup details such as hyperparameters or training configurations. |