Projection-Free Bandit Optimization with Privacy Guarantees

Authors: Alina Ene, Huy L. Nguyen, Adrian Vladu (pp. 7322-7330)

AAAI 2021

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We design differentially private algorithms for the bandit convex optimization problem in the projection-free setting. This is the first differentially private algorithm for projection-free bandit optimization, and in fact our bound matches the best known non-private projection-free algorithm and the best known private algorithm, even for the weaker setting where projections are available. Our main focus is on the dependency on the dimension n of the ambient space, the number T of iterations, and the privacy budget ε. For ease of comparison, we use the Õ notation to hide poly-logarithmic factors in n and T, as well as parameters such as the Lipschitz constant of the loss functions. The precise guarantees can be found in Lemma 8 (for (ε, 0)-privacy) and Lemma 11 (for (ε, δ)-privacy).
Researcher Affiliation | Academia | Alina Ene (Department of Computer Science, Boston University); Huy L. Nguyen (Khoury College of Computer and Information Science, Northeastern University); Adrian Vladu (CNRS & IRIF, Université de Paris)
Pseudocode | Yes | Algorithm 1 PRIVATEBANDIT(T, P, D). Input: time horizon T, symmetric noise distribution P, diameter D of the domain. 1: T_round = T^{1/2}, T_batch = T / T_round, η = D / (T^{3/4} n^{1/2} L), ζ = T^{1/4}.
Open Source Code | No | No explicit statement or link to open-source code for the methodology described in this paper.
Open Datasets | No | The paper is theoretical and does not use or specify any publicly available datasets.
Dataset Splits | No | The paper is theoretical and does not discuss dataset splits for training, validation, or testing.
Hardware Specification | No | The paper is theoretical and does not describe hardware used for experiments.
Software Dependencies | No | The paper discusses algorithms and mechanisms (e.g., 'Gaussian and Laplacian mechanisms', 'tree based aggregation') but does not specify any software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and defines algorithmic parameters (e.g., 'T_round = T^{1/2}, T_batch = T / T_round, η = D / (T^{3/4} n^{1/2} L), ζ = T^{1/4}'), but does not provide specific experimental setup details such as hyperparameters for training empirical models.
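The batched, projection-free loop quoted in the Pseudocode row can be illustrated as follows. This is a hedged sketch, not the paper's Algorithm 1: the spherical one-point gradient estimator, the Laplace noise scale, the domain (a Euclidean ball of diameter D), and the negative exponent used for ζ are all assumptions made here for illustration.

```python
import numpy as np

def private_bandit_sketch(loss, T, D=1.0, L=1.0, n=2, eps=1.0, seed=0):
    """Illustrative batched projection-free bandit loop with Laplace noise.

    Not the paper's Algorithm 1; the estimator, noise scale, and domain
    (a Euclidean ball of diameter D) are assumptions for this sketch.
    """
    rng = np.random.default_rng(seed)
    T_round = int(T ** 0.5)                # T^{1/2} outer rounds
    T_batch = T // T_round                 # T / T_round plays per round
    eta = D / (T ** 0.75 * n ** 0.5 * L)   # step size, as in the table row
    zeta = T ** -0.25                      # exploration radius (sign assumed)
    x = np.zeros(n)                        # start at the center of the ball
    grad_acc = np.zeros(n)                 # accumulated noisy gradients
    total = 0.0
    for _ in range(T_round):
        g = np.zeros(n)
        for _ in range(T_batch):
            u = rng.standard_normal(n)
            u /= np.linalg.norm(u)         # uniform direction on the sphere
            val = loss(x + zeta * u)       # one-point bandit feedback
            total += val
            g += (n / zeta) * val * u      # spherical gradient estimate
        g /= T_batch
        # Laplace noise for privacy; this scale is an assumption of the sketch
        g += rng.laplace(scale=n / (eps * T_batch), size=n)
        grad_acc += g
        # linear optimization oracle over the ball of radius D/2 (no projection)
        v = -(D / 2) * grad_acc / (np.linalg.norm(grad_acc) + 1e-12)
        x = (1 - eta) * x + eta * v        # convex step keeps x inside the ball
    return x, total / (T_round * T_batch)
```

Because each update is a convex combination of in-ball points, the iterate never needs a projection; the domain is only ever queried through a linear optimization oracle, which is the defining feature of the projection-free setting.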
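The 'tree based aggregation' named in the Software Dependencies row refers to the standard binary-tree counter mechanism for releasing private prefix sums. The sketch below is a generic textbook version under assumed parameters (per-node Laplace noise with scale sensitivity·levels/ε), not the paper's specific construction.

```python
import math
import numpy as np

class BinaryMechanism:
    """Textbook binary-tree aggregation for private prefix sums (a sketch;
    the noise budget split across tree levels is an assumption)."""

    def __init__(self, T, eps, sensitivity=1.0, seed=0):
        self.levels = math.ceil(math.log2(T + 1)) + 1
        self.scale = sensitivity * self.levels / eps  # Laplace scale per node
        self.rng = np.random.default_rng(seed)
        self.alpha = [0.0] * self.levels   # exact sums over dyadic intervals
        self.noisy = [0.0] * self.levels   # the same sums, noised once each
        self.t = 0

    def add(self, value):
        """Ingest one value; return a private estimate of the prefix sum."""
        self.t += 1
        i = (self.t & -self.t).bit_length() - 1      # lowest set bit of t
        self.alpha[i] = sum(self.alpha[:i]) + value  # close a dyadic interval
        for j in range(i):                           # children are absorbed
            self.alpha[j] = 0.0
            self.noisy[j] = 0.0
        self.noisy[i] = self.alpha[i] + self.rng.laplace(scale=self.scale)
        # combine the O(log T) nodes whose intervals tile [1, t]
        return sum(self.noisy[j] for j in range(self.levels)
                   if (self.t >> j) & 1)
```

Each released prefix sum combines only O(log T) noisy nodes, which is what gives tree-based aggregation its accuracy advantage over adding fresh noise to every running sum.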