Active Learning with Neural Networks: Insights from Nonparametric Statistics
Authors: Yinglun Zhu, Robert Nowak
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The paper states: 'Our paper is theoretical in nature, and there is no negative societal impact of our work in the foreseeable future.' It also notes that 'rigorous label complexity guarantees of deep active learning have remained elusive. This constitutes a significant gap between theory and practice. This paper tackles this gap by providing the first near-optimal label complexity guarantees for deep active learning.' |
| Researcher Affiliation | Academia | Yinglun Zhu Department of Computer Sciences University of Wisconsin-Madison Madison, WI 53706 yinglun@cs.wisc.edu Robert Nowak Department of Electrical and Computer Engineering University of Wisconsin-Madison Madison, WI 53706 rdnowak@wisc.edu |
| Pseudocode | Yes | Algorithm 1: Neural CAL; Algorithm 2: Neural CAL++ |
| Open Source Code | No | The checklist section indicates N/A for code availability: 'Did you include the code, data, and instructions needed to reproduce the main experimental results (either in the supplemental material or as a URL)? [N/A]' |
| Open Datasets | No | The paper is theoretical in nature and does not report on experimental results; therefore, no datasets were used for training. |
| Dataset Splits | No | The paper is theoretical and does not describe experiments; therefore, no training, validation, or test splits are provided. |
| Hardware Specification | No | The paper is theoretical in nature and does not report on experimental results; therefore, no hardware specifications are provided. |
| Software Dependencies | No | The paper is theoretical in nature and does not report on experimental results; therefore, no specific software dependencies with version numbers are provided. |
| Experiment Setup | No | The paper is theoretical in nature and does not report on experimental results; therefore, no experimental setup details such as hyperparameters or training configurations are provided. |