Revisiting Perceptron: Efficient and Label-Optimal Learning of Halfspaces

Authors: Songbai Yan, Chicheng Zhang

NeurIPS 2017

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | "In this work, we propose an efficient Perceptron-based algorithm for actively learning homogeneous halfspaces... Furthermore, we show that our active learning algorithm can be converted to an efficient passive learning algorithm that has near-optimal sample complexities with respect to ε and d."
Researcher Affiliation | Collaboration | Songbai Yan (UC San Diego, La Jolla, CA; yansongbai@ucsd.edu); Chicheng Zhang (Microsoft Research, New York, NY; chicheng.zhang@microsoft.com)
Pseudocode | Yes | Algorithm 1: ACTIVE-PERCEPTRON; Algorithm 2: MODIFIED-PERCEPTRON
Open Source Code | No | The paper provides no explicit statement or link indicating the availability of its source code.
Open Datasets | No | The paper is theoretical and analyzes algorithms under assumed data distributions (e.g., the uniform distribution over the unit sphere); it uses no public datasets and provides no access information for any.
Dataset Splits | No | The paper is theoretical and involves no empirical evaluation, so it specifies no training, validation, or test splits.
Hardware Specification | No | The paper focuses on algorithm design and analysis and mentions no hardware used for running experiments.
Software Dependencies | No | The paper presents algorithms and proofs and lists no software dependencies or version numbers required for implementation or reproduction.
Experiment Setup | No | The paper describes no experimental setup, hyperparameters, or training configuration for empirical evaluation.
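The MODIFIED-PERCEPTRON update referenced in the pseudocode row, which on a mistake reflects the hypothesis across the misclassified unit vector, combined with querying labels only near the current decision boundary, can be sketched as below. This is an illustrative reconstruction under assumptions, not the paper's exact algorithm: the band width, its shrinking schedule, the sample budget, and the dimension are all choices made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5  # ambient dimension (illustrative choice)

def unit(v):
    return v / np.linalg.norm(v)

# Hidden target halfspace u and initial hypothesis w, both on the unit sphere.
u = unit(rng.standard_normal(d))
w = unit(rng.standard_normal(d))
w0 = w.copy()  # keep the starting hypothesis for comparison

band = 0.3  # query labels only for points with |w . x| <= band (assumed schedule)
for _ in range(5000):
    x = unit(rng.standard_normal(d))   # draw uniformly from the unit sphere
    if abs(w @ x) > band:              # far from the boundary: skip, no label query
        continue
    y = np.sign(u @ x)                 # label oracle defined by the hidden halfspace
    if np.sign(w @ x) != y:            # on a mistake, reflect w across x:
        w = w - 2 * (w @ x) * x        # modified-Perceptron update (norm-preserving)
    band = max(band * 0.999, 0.01)     # gradually shrink the query band

print(f"alignment w.u: {w @ u:.3f}")
```

The reflection update preserves the norm of w (since x is a unit vector) and strictly increases w·u on every mistake, because a misclassified x has (w·x)(u·x) < 0; this monotone progress is what the active-learning analysis exploits.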