Towards a Combinatorial Characterization of Bounded-Memory Learning

Authors: Alon Gonen, Shachar Lovett, Michal Moshkovitz

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We prove both upper and lower bounds for our candidate solution, that match in some regime of parameters. This is the first characterization of strong learning under space constraints in any regime. In this section we prove our upper bounds: Theorem 1 and Theorem 3. In this section we prove our lower bounds: Theorem 2 and Theorem 4.
Researcher Affiliation | Academia | Alon Gonen, Shachar Lovett, Michal Moshkovitz (University of California San Diego)
Pseudocode | No | The paper describes algorithms like Boosting-By-Majority conceptually and references other works for details, but it does not contain structured pseudocode or algorithm blocks within its text.
Open Source Code | No | The paper is theoretical and does not describe a software implementation or provide any links or statements about releasing source code for a methodology.
Open Datasets | No | The paper is purely theoretical and does not perform experiments involving datasets, so no information about publicly available training datasets is provided.
Dataset Splits | No | The paper is theoretical and does not conduct experiments with data, and thus does not mention training, validation, or test dataset splits.
Hardware Specification | No | The paper is theoretical and does not describe computational experiments, so no specific hardware specifications are mentioned.
Software Dependencies | No | The paper is theoretical and does not describe any software implementations or dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not involve empirical experiments; therefore, no experimental setup details such as hyperparameters or training settings are provided.
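
The Pseudocode row notes that the paper discusses Boosting-By-Majority only conceptually and defers details to prior work. As a concrete reference point, here is a minimal sketch of the classic Boost-by-Majority example-weighting rule from Freund (1995); it is not code taken from or implied by the paper, and the round-indexing convention, the edge parameter gamma, and the function names are assumptions made here for illustration.

```python
import math


def bbm_weight(k: int, i: int, r: int, gamma: float) -> float:
    """Boost-by-Majority weight of a training example at round i (1-indexed),
    given that r of the first i-1 weak hypotheses classified it correctly.

    k is the total number of boosting rounds and gamma is the assumed edge of
    the weak learner. The weight equals the probability that a
    Binomial(k - i, 1/2 + gamma) draw hits exactly floor(k/2) - r, i.e. the
    chance this example ends up on the knife's edge of the final majority vote.
    """
    n, s = k - i, k // 2 - r
    if s < 0 or s > n:
        return 0.0  # example's final outcome is already decided either way
    return math.comb(n, s) * (0.5 + gamma) ** s * (0.5 - gamma) ** (n - s)


def majority_vote(hypotheses, x):
    """Final Boost-by-Majority prediction: unweighted majority vote over the
    k weak hypotheses, with labels assumed to be in {-1, +1}."""
    return 1 if sum(h(x) for h in hypotheses) >= 0 else -1
```

A full learner would, at each round, reweight the sample with bbm_weight, run the weak learner on the reweighted distribution, and append the returned hypothesis; the paper's contribution concerns how much memory such a learner must retain, which this generic sketch does not attempt to capture.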