Hypothesis Selection with Memory Constraints

Authors: Maryam Aliakbarpour, Mark Bun, Adam Smith

NeurIPS 2023

Reproducibility assessment (variable, result, LLM response):
Research Type: Theoretical. Our main result is an algorithm that achieves a nearly optimal tradeoff between memory usage and sample complexity. In particular, given b bits of memory (for b roughly between log n and n), the algorithm solves the hypothesis selection problem with s samples, where b · s = O(n log n). This tradeoff is optimal up to an O(log n) factor, for all b.
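To make the stated tradeoff concrete, the sketch below computes the illustrative sample budget s implied by a memory budget b when one takes b · s = n log n with the hidden constant set to 1. This is an assumption for illustration only; the paper guarantees the bound only up to constant and O(log n) factors.

```python
import math

def samples_needed(n: int, b: int) -> int:
    """Illustrative sample complexity s for n hypotheses and b bits of
    memory, assuming b * s = n log n holds with constant 1 (the paper
    only guarantees b * s = O(n log n), optimal up to O(log n))."""
    if not (math.log2(n) <= b <= n):
        raise ValueError("b should be roughly between log n and n")
    return math.ceil(n * math.log2(n) / b)

# More memory means fewer samples: doubling b roughly halves s.
n = 1024
print(samples_needed(n, b=16))    # tight memory -> 640 samples
print(samples_needed(n, b=1024))  # b = n -> only log n = 10 samples
```

The inverse-proportional shape is the point of the result: any memory budget b in the allowed range buys a corresponding sample complexity s, with the product held near n log n.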
Researcher Affiliation: Academia. Maryam Aliakbarpour, Department of Computer Science, Rice University, Houston, TX 77005 (maryama@rice.edu); Mark Bun, Department of Computer Science, Boston University, Boston, MA 02215 (mbun@bu.edu); Adam Smith, Department of Computer Science, Boston University, Boston, MA 02215 (ads22@bu.edu).
Pseudocode: Yes. Algorithm 1: Choosing between two hypotheses; Algorithm 2: Random ladder tournament; Algorithm 3: Hypothesis selection with no decent hypothesis; Algorithm 4: Filtering algorithm; Algorithm 5: Reduction from a proper learner without promise to a proper learner with promise.
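The classic primitive behind "choosing between two hypotheses" is the Scheffé test; the paper's Algorithm 1 is a memory-constrained variant of this kind of pairwise comparison. The sketch below is the textbook, memory-unconstrained version, with the distributions and function name chosen purely for illustration, not taken from the paper.

```python
import random

def scheffe_test(p, q, samples):
    """Textbook Scheffé test: given two candidate distributions p and q
    (dicts mapping outcome -> probability) and samples from an unknown
    source, pick the candidate whose predicted mass on the Scheffé set
    better matches the empirical mass. Illustrative only; not the
    paper's low-memory Algorithm 1."""
    # Scheffé set A: outcomes where p assigns strictly more mass than q.
    support = set(p) | set(q)
    A = {x for x in support if p.get(x, 0.0) > q.get(x, 0.0)}
    pA = sum(p.get(x, 0.0) for x in A)
    qA = sum(q.get(x, 0.0) for x in A)
    # Empirical mass of A under the unknown source.
    emp = sum(1 for x in samples if x in A) / len(samples)
    # p predicts mass pA on A and q predicts qA; return the closer one.
    return "p" if abs(emp - pA) <= abs(emp - qA) else "q"

# Toy check: samples drawn from p (a biased coin) should select "p".
random.seed(0)
p = {"H": 0.8, "T": 0.2}
q = {"H": 0.3, "T": 0.7}
draws = random.choices(["H", "T"], weights=[0.8, 0.2], k=500)
print(scheffe_test(p, q, draws))  # prints "p"
```

A tournament over all n candidates runs such pairwise tests and keeps a winner; the memory question the paper studies is how little state such a tournament can maintain while preserving near-optimal sample complexity.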
Open Source Code: No. The paper contains no explicit statement about releasing source code and no link to a code repository for the described methodology.
Open Datasets: No. The paper is theoretical and conducts no experiments on specific datasets, so no training-dataset access information is provided.
Dataset Splits: No. The paper is theoretical and describes no experimental setup, so no training, validation, or test splits are given.
Hardware Specification: No. The paper is theoretical and does not describe any hardware used for experiments or computations.
Software Dependencies: No. The paper is theoretical and lists no software dependencies with version numbers.
Experiment Setup: No. The paper focuses on algorithm design and analysis, so it provides no experimental setup, hyperparameters, or training configurations.