Optimal Parallelization of Boosting
Authors: Arthur da Cunha, Mikael Møller Høgsgaard, Kasper Green Larsen
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this work, we essentially close this gap by providing both improved lower bounds on the parallel complexity of weak-to-strong learners, and a parallel Boosting algorithm whose performance matches these bounds across the entire p vs. t compromise spectrum, up to logarithmic factors. Ultimately, this work settles the parallel complexity of Boosting algorithms that are nearly sample-optimal. |
| Researcher Affiliation | Academia | Arthur da Cunha, Department of Computer Science, Aarhus University, dac@cs.au.dk; Mikael Møller Høgsgaard, Department of Computer Science, Aarhus University, hogsgaard@cs.au.dk; Kasper Green Larsen, Department of Computer Science, Aarhus University, larsen@cs.au.dk |
| Pseudocode | Yes | Algorithm 1: Proposed parallel Boosting algorithm (an illustrative sketch follows the table) |
| Open Source Code | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Open Datasets | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Dataset Splits | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Hardware Specification | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Software Dependencies | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
| Experiment Setup | No | The paper is theoretical, and we have no experiments, data or code in the paper. |
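
The paper states its Algorithm 1 only as pseudocode, with no reference implementation. As a rough illustration of the p (boosting rounds) vs. t (parallel work per round) trade-off the authors study, the following is a minimal, hedged sketch in Python: a plain AdaBoost loop in which t candidate decision stumps are trained concurrently per round and the one with the lowest weighted error is kept. This is not the paper's Algorithm 1, and every function and parameter name below is hypothetical.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def train_stump(X, y, w, feature):
    """Fit a one-feature threshold stump minimizing weighted 0/1 error."""
    best_err, best_thr, best_sign = np.inf, 0.0, 1
    for thr in np.unique(X[:, feature]):
        for sign in (1, -1):
            pred = sign * np.where(X[:, feature] >= thr, 1, -1)
            err = float(np.sum(w[pred != y]))  # total weight of misclassified points
            if err < best_err:
                best_err, best_thr, best_sign = err, thr, sign
    return best_err, (feature, best_thr, best_sign)

def parallel_boost(X, y, p=10, t=4, seed=0):
    """AdaBoost-style loop, t weak learners trained in parallel per round.

    Labels y must be in {-1, +1}. Hypothetical sketch, not the paper's algorithm.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(p):  # p boosting rounds, inherently sequential
        feats = rng.choice(d, size=min(t, d), replace=False)
        with ThreadPoolExecutor(max_workers=t) as ex:  # t stumps fitted in parallel
            results = list(ex.map(lambda f: train_stump(X, y, w, f), feats))
        err, stump = min(results, key=lambda r: r[0])
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # AdaBoost weight of this stump
        f, thr, sign = stump
        pred = sign * np.where(X[:, f] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)  # exponential reweighting of the sample
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote over all stumps."""
    agg = np.zeros(len(X))
    for alpha, (f, thr, sign) in ensemble:
        agg += alpha * sign * np.where(X[:, f] >= thr, 1, -1)
    return np.sign(agg)
```

Calling `parallel_boost(X, y, p=10, t=4)` returns a weighted ensemble for `predict`. Increasing t spends more parallel work per round in exchange for (potentially) needing fewer rounds p, which is the compromise spectrum the paper's matching lower and upper bounds make precise.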