Differentially Private Bayesian Optimization
Authors: Matt Kusner, Jacob Gardner, Roman Garnett, Kilian Weinberger
ICML 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We prove that under a GP assumption these private quantities are often near-optimal. Finally, even if this assumption is not satisfied, we can use different smoothness guarantees to protect privacy. Our privacy guarantees hold for releasing the best hyperparameters and best validation gain. Specifically, our contributions are as follows: 1. We derive, to the best of our knowledge, the first framework for Bayesian optimization with differential privacy guarantees, with/without observation noise; 2. We show that even if our validation gain is not drawn from a Gaussian process, we can guarantee differential privacy under different smoothness assumptions. |
| Researcher Affiliation | Academia | Matt J. Kusner (MKUSNER@WUSTL.EDU), Jacob R. Gardner (GARDNER.JAKE@WUSTL.EDU), Roman Garnett (GARNETT@WUSTL.EDU), Kilian Q. Weinberger (KILIAN@WUSTL.EDU), Washington University in St. Louis, 1 Brookings Dr., St. Louis, MO 63130 |
| Pseudocode | Yes | Algorithm 1: Private Bayesian Opt. (noisy observations); Algorithm 2: Private Bayesian Opt. (noise-free observations); Algorithm 3: Private Bayesian Opt. (Lipschitz and convex). An illustrative sketch of the underlying release mechanisms follows the table. |
| Open Source Code | No | The paper does not provide any concrete access information for open-source code. |
| Open Datasets | No | The paper does not provide concrete access information (specific link, DOI, repository name, formal citation with authors/year, or reference to established benchmark datasets) for a publicly available or open dataset. It discusses "validation dataset" generally. |
| Dataset Splits | No | The paper does not provide specific dataset split information (exact percentages, sample counts, citations to predefined splits, or detailed splitting methodology) needed to reproduce the data partitioning. It discusses "validation dataset" conceptually. |
| Hardware Specification | No | The paper does not provide specific hardware details (exact GPU/CPU models, processor types with speeds, memory amounts, or detailed computer specifications) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details (e.g., library or solver names with version numbers like Python 3.8, CPLEX 12.4) needed to replicate the experiment. |
| Experiment Setup | No | The paper does not contain specific experimental setup details (concrete hyperparameter values, training configurations, or system-level settings) in the main text, as it is a theoretical paper without empirical experiments. |
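
The paper's algorithms are given only as pseudocode. As a rough, illustrative sketch of the kind of release mechanisms such a framework builds on, the snippet below privately selects a best hyperparameter via the exponential mechanism and releases the corresponding best validation gain via the Laplace mechanism. The function names, candidate values, and sensitivity figure are hypothetical placeholders, not the authors' implementation; in the paper, the corresponding sensitivity bounds come from Gaussian-process or Lipschitz/convexity assumptions.

```python
import numpy as np


def exponential_mechanism_select(candidates, utilities, epsilon, sensitivity, rng=None):
    """Pick one candidate (e.g. a hyperparameter setting) with probability
    proportional to exp(epsilon * utility / (2 * sensitivity))."""
    rng = np.random.default_rng() if rng is None else rng
    utilities = np.asarray(utilities, dtype=float)
    # Subtract the max utility for numerical stability; the distribution is unchanged.
    scores = epsilon * (utilities - utilities.max()) / (2.0 * sensitivity)
    probs = np.exp(scores)
    probs /= probs.sum()
    idx = rng.choice(len(candidates), p=probs)
    return candidates[idx]


def laplace_release(value, epsilon, sensitivity, rng=None):
    """Release a scalar (e.g. the best validation gain) with Laplace noise
    of scale sensitivity / epsilon, the standard epsilon-DP mechanism."""
    rng = np.random.default_rng() if rng is None else rng
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)


# Hypothetical usage: `evaluated_lambdas` are hyperparameters tried during a
# Bayesian-optimization run and `validation_gains` their data-dependent scores.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    evaluated_lambdas = [0.01, 0.1, 1.0, 10.0]    # candidate hyperparameters (illustrative)
    validation_gains = [0.71, 0.84, 0.88, 0.79]   # per-candidate validation accuracy (illustrative)
    eps = 1.0                                      # privacy budget per release (assumed)
    sens = 1.0 / 500                               # assumed sensitivity, e.g. 1 / |validation set|

    best_lambda = exponential_mechanism_select(evaluated_lambdas, validation_gains, eps, sens, rng)
    best_gain = laplace_release(max(validation_gains), eps, sens, rng)
    print(best_lambda, best_gain)
```

Each release consumes its own epsilon-sized share of the privacy budget, so publishing both the selected hyperparameter and the noised validation gain costs 2·epsilon under basic composition.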