Active Learning Polynomial Threshold Functions

Authors: Omri Ben-Eliezer, Max Hopkins, Chutong Yang, Hantao Yu

NeurIPS 2022

Reproducibility assessment. Each variable below is listed with its result, followed by the LLM's supporting response.
Research Type: Theoretical
LLM Response: We initiate the study of active learning polynomial threshold functions (PTFs). While traditional lower bounds imply that even univariate quadratics cannot be non-trivially actively learned, we show that allowing the learner basic access to the derivatives of the underlying classifier circumvents this issue and leads to a computationally efficient algorithm for active learning degree-d univariate PTFs in O(d^3 log(1/εδ)) queries. We prove that if the learner is allowed access to sign(p^(i)(x)) (the sign of the i-th order derivative of p), PTFs are learnable in O(log(1/ε)) queries, but require Ω(1/ε) queries if the learner is missing access to even a single relevant derivative. (An illustrative sketch of the derivative-based binary-search idea appears after the assessment.)
Researcher Affiliation: Academia
LLM Response: Omri Ben-Eliezer, Department of Mathematics, Massachusetts Institute of Technology (omrib@mit.edu); Max Hopkins, Department of Computer Science and Engineering, University of California, San Diego (nmhopkin@eng.ucsd.edu); Chutong Yang, Department of Computer Science, Stanford University (yct1998@stanford.edu); Hantao Yu, Department of Computer Science, Columbia University (hantao.yu@columbia.edu).

Pseudocode: Yes
LLM Response: The paper presents Algorithm 1, ITERATIVE-ALGORITHM(f, S), and Algorithm 2, BATCH-KLMZ(S, m).

Open Source Code: No
LLM Response: The paper does not mention releasing source code or provide a link to a code repository. The ethics checklist states: 'Did you include the code, data, and instructions needed to reproduce the main experimental results (either in the supplemental material or as a URL)? [N/A]'.
Open Datasets: No
LLM Response: This is a theoretical paper that performs no experiments with datasets, so it does not mention public datasets for training.

Dataset Splits: No
LLM Response: This is a theoretical paper focused on algorithmic complexity and proofs rather than empirical validation; it does not mention training, validation, or test splits.
Hardware Specification: No
LLM Response: As this is a theoretical paper, no experiments were conducted that would require hardware specifications.

Software Dependencies: No
LLM Response: As this is a theoretical paper, no experiments were conducted that would require specific software dependencies or versions.

Experiment Setup: No
LLM Response: As this is a theoretical paper, no experiments were conducted that would require detailed setup information such as hyperparameters.
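To make the derivative-access idea from the abstract concrete, here is a minimal, idealized Python sketch. It is not the paper's Algorithm 1 and ignores the paper's O(d^3 log(1/εδ)) accounting: it assumes a noiseless sign oracle on a bounded interval, and the function names (sign_query, bisect_root, ptf_sign_changes) are ours. The key observation it illustrates is that the sign changes of p^(i+1) partition the interval into pieces on which p^(i) is monotone, so each sign change of p^(i) can be located by binary search.

```python
import numpy as np

def sign_query(f, x):
    # One "label" query: the learner only observes sign(f(x)).
    return 1 if f(x) >= 0 else -1

def bisect_root(f, lo, hi, eps):
    # Binary search for the sign change of f on [lo, hi], assuming f is
    # monotone there; uses O(log((hi - lo)/eps)) sign queries.
    s_lo = sign_query(f, lo)
    if s_lo == sign_query(f, hi):
        return None  # no sign change on this monotone piece
    while hi - lo > eps:
        mid = (lo + hi) / 2.0
        if sign_query(f, mid) == s_lo:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def ptf_sign_changes(derivs, lo, hi, eps):
    # derivs[i] gives sign-query access to p^(i). The sign changes of p'
    # (found recursively) cut [lo, hi] into (approximately) monotone
    # pieces of p, each containing at most one sign change of p.
    if len(derivs) == 1:
        return []  # p^(d) is constant: no sign changes
    cuts = [lo] + ptf_sign_changes(derivs[1:], lo, hi, eps) + [hi]
    roots = []
    for a, b in zip(cuts, cuts[1:]):
        r = bisect_root(derivs[0], a, b, eps)
        if r is not None:
            roots.append(r)
    return roots

# Example: recover sign(p) for p(x) = x(x - 0.2)(x + 0.5) on [-1, 1].
p = np.polynomial.Polynomial.fromroots([0.0, 0.2, -0.5])
derivs = [p]
for _ in range(p.degree()):
    derivs.append(derivs[-1].deriv())
print(ptf_sign_changes(derivs, -1.0, 1.0, 1e-6))  # ≈ [-0.5, 0.0, 0.2]
```

Once the sign-change locations are sorted, sign(p) is determined everywhere up to ε-neighborhoods of the roots using one extra query per piece. In this noiseless toy setting the sketch makes roughly O(d^2 log(1/ε)) sign queries; the paper's algorithm handles the harder distributional setting within its stated O(d^3 log(1/εδ)) bound.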