Smoothed Analysis of Sequential Probability Assignment

Authors: Alankrita Bhatt, Nika Haghtalab, Abhishek Shetty

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We initiate the study of smoothed analysis for the sequential probability assignment problem with contexts. We study information-theoretically optimal minimax rates as well as a framework for algorithmic reduction involving the maximum likelihood estimator oracle. Our approach establishes a general-purpose reduction from minimax rates for sequential probability assignment for smoothed adversaries to minimax rates for transductive learning. This leads to optimal (logarithmic) fast rates for parametric classes and classes with finite VC dimension. On the algorithmic front, we develop an algorithm that efficiently taps into the MLE oracle for general classes of functions. We show that under general conditions this algorithmic approach yields sublinear regret.
Researcher Affiliation | Academia | Alankrita Bhatt, California Institute of Technology (abhatt@caltech.edu); Nika Haghtalab, University of California, Berkeley (nika@berkeley.edu); Abhishek Shetty, University of California, Berkeley (shetty@berkeley.edu)
Pseudocode | Yes | Algorithm 1: Probability assignment QFTPL (an illustrative sketch of the oracle-based idea appears below this table)
Open Source Code | No | The paper does not provide any links to its own source code or explicitly state that its code is being released.
Open Datasets | No | This is a theoretical paper that focuses on mathematical proofs and algorithmic analysis rather than empirical experiments; it does not use specific datasets or provide access information for any.
Dataset Splits | No | This is a theoretical paper and does not involve empirical experiments with datasets, so no dataset splits for validation are specified.
Hardware Specification | No | This is a theoretical paper focusing on algorithms and mathematical proofs; it does not mention any hardware used for experiments.
Software Dependencies | No | This is a theoretical paper focusing on algorithms and mathematical proofs; it does not list any specific software dependencies with version numbers.
Experiment Setup | No | This is a theoretical paper focused on algorithm design and mathematical analysis; it does not describe an experimental setup with hyperparameters or training configurations.
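The Pseudocode row refers to the paper's follow-the-perturbed-leader style probability assignment built around a maximum likelihood estimator (MLE) oracle. What follows is a minimal, illustrative Python sketch of that general oracle-based idea only, not a reproduction of the paper's Algorithm 1: it assumes a finite class of Bernoulli predictors, and the helper name mle_oracle, the Gumbel perturbation, and the scale parameter are hypothetical choices made for this example.

import numpy as np

def mle_oracle(hypotheses, xs, ys, perturbation):
    # Hypothetical helper: return the index of the hypothesis maximizing the
    # perturbed log-likelihood of the past (context, label) pairs.
    def score(i):
        h = hypotheses[i]
        log_lik = sum(np.log(h(x)) if y == 1 else np.log(1.0 - h(x))
                      for x, y in zip(xs, ys))
        return log_lik + perturbation[i]
    return max(range(len(hypotheses)), key=score)

def ftpl_probability_assignment(hypotheses, contexts, labels, scale=1.0, seed=0):
    # At each round t, draw a fresh random perturbation, call the perturbed
    # MLE oracle on the data seen so far, and predict with the chosen hypothesis.
    rng = np.random.default_rng(seed)
    predictions = []
    for t, x_t in enumerate(contexts):
        noise = scale * rng.gumbel(size=len(hypotheses))
        chosen = mle_oracle(hypotheses, contexts[:t], labels[:t], noise)
        predictions.append(hypotheses[chosen](x_t))
    return predictions

# Toy usage: two constant Bernoulli predictors on a short binary sequence.
hypotheses = [lambda x: 0.3, lambda x: 0.7]
contexts = list(range(10))
labels = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]
print(ftpl_probability_assignment(hypotheses, contexts, labels))

In this sketch the perturbation scale trades off fit to past data against random exploration: a larger scale makes the prediction sequence more randomized, while a smaller scale keeps it close to the plain MLE prediction.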