Margin-Based Generalization Lower Bounds for Boosted Classifiers
Authors: Allan Grønlund, Lior Kamma, Kasper Green Larsen, Alexander Mathiasen, Jelani Nelson
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we give the first margin-based lower bounds on the generalization error of boosted classifiers. Our lower bounds nearly match the kth margin bound and thus almost settle the generalization performance of boosted classifiers in terms of margins. The main argument that lies at the heart of both proofs is a probabilistic method argument. |
| Researcher Affiliation | Academia | Department of Computer Science, Aarhus University, {jallan,lior.kamma,larsen,alexmath}@cs.au.dk Department of EECS, UC Berkeley, minilek@berkeley.edu |
| Pseudocode | Yes | Algorithm 1: Construct a Voting Classifier |
| Open Source Code | No | The paper refers to an arXiv preprint of itself ([GKL+19] arXiv:1909.12518) for deferred proofs, not a code repository. There is no explicit statement about releasing source code for the methodology. |
| Open Datasets | No | This is a theoretical paper and does not conduct experiments on datasets that would require training data access. |
| Dataset Splits | No | This is a theoretical paper and does not involve experimental validation with dataset splits. |
| Hardware Specification | No | This is a theoretical paper and does not describe any experimental setup or hardware specifications. |
| Software Dependencies | No | This is a theoretical paper and does not mention any software dependencies with specific version numbers. |
| Experiment Setup | No | This is a theoretical paper and does not describe any experimental setup details such as hyperparameters or training configurations. |
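The table above notes that the paper's bounds are stated in terms of margins of a voting classifier (and that its only pseudocode is "Algorithm 1: Construct a Voting Classifier"). As a minimal illustration of the objects involved, the sketch below implements a generic weighted-majority voting classifier over {-1, +1}-valued base hypotheses and the standard margin of a labeled point; it is not a reproduction of the paper's Algorithm 1, and the threshold stumps and weights are hypothetical.

```python
# Hedged sketch: a generic weighted voting classifier and its margin.
# This is NOT the paper's Algorithm 1; hypotheses and weights below are
# illustrative assumptions only.

def vote(hypotheses, weights, x):
    """Normalized weighted vote f(x) = sum_t a_t * h_t(x) / sum_t a_t, in [-1, 1]."""
    total = sum(weights)
    return sum(a * h(x) for a, h in zip(weights, hypotheses)) / total

def margin(hypotheses, weights, x, y):
    """Margin of labeled point (x, y): y * f(x).

    Positive iff the weighted vote classifies x correctly; its magnitude
    measures the confidence of the vote.
    """
    return y * vote(hypotheses, weights, x)

# Toy example: three decision stumps on the real line (hypothetical).
hs = [lambda x: 1 if x > 0 else -1,
      lambda x: 1 if x > 1 else -1,
      lambda x: 1 if x > -1 else -1]
alphas = [0.5, 0.3, 0.2]

m = margin(hs, alphas, x=0.5, y=1)  # stumps vote +1, -1, +1 -> margin 0.4
```

Margin-based generalization bounds (such as the kth margin bound the paper's lower bounds nearly match) are stated in terms of the distribution of this quantity over the training sample, not just the training error.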