Private Learning Implies Online Learning: An Efficient Reduction
Authors: Alon Gonen, Elad Hazan, Shay Moran
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper we resolve this open question in the context of pure differential privacy. We derive an efficient black-box reduction from differentially private learning to online learning from expert advice. |
| Researcher Affiliation | Collaboration | Alon Gonen, University of California San Diego (algonen@cs.ucsd.edu); Elad Hazan, Princeton University and Google AI Princeton (ehazan@princeton.edu); Shay Moran, Google AI Princeton (shaymoran1@gmail.com) |
| Pseudocode | Yes | Algorithm 1: Weak online learner for oblivious adversaries |
| Open Source Code | No | The paper does not provide any statements about releasing source code or links to a code repository. |
| Open Datasets | No | The paper is a theoretical work and does not describe or use specific datasets for empirical training or evaluation. |
| Dataset Splits | No | The paper is theoretical and does not mention specific training, validation, or test dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not mention any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not specify any software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not detail any experimental setup, hyperparameters, or training configurations. |
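
The reduction noted in the Research Type row is black-box: it invokes an online learner for prediction from expert advice as a subroutine. The paper's own Algorithm 1 is not reproduced here; as an illustrative point of reference only, the sketch below implements the standard multiplicative-weights (Hedge) learner for the expert-advice setting. The function name `hedge`, the learning rate `eta`, and the loss-matrix interface are assumptions made for this sketch, not part of the paper.

```python
import numpy as np

def hedge(loss_matrix, eta):
    """Multiplicative-weights (Hedge) learner for prediction with expert advice.

    loss_matrix: array of shape (T, N), losses in [0, 1] for N experts over T rounds.
    Returns the sequence of distributions played and the learner's cumulative expected loss.
    """
    T, N = loss_matrix.shape
    weights = np.ones(N)          # one weight per expert, initially uniform
    total_loss = 0.0
    played = []
    for t in range(T):
        p = weights / weights.sum()      # distribution over experts this round
        played.append(p)
        losses = loss_matrix[t]
        total_loss += p @ losses         # learner's expected loss this round
        weights *= np.exp(-eta * losses) # exponential multiplicative update
    return np.array(played), total_loss

# Illustrative usage with synthetic losses (hypothetical data, 100 rounds, 5 experts):
rng = np.random.default_rng(0)
L = rng.random((100, 5))
dists, learner_loss = hedge(L, eta=np.sqrt(2 * np.log(5) / 100))
best_expert_loss = L.sum(axis=0).min()
```

With the learning rate set to roughly sqrt(2 ln(N) / T) as above, the learner's cumulative loss exceeds that of the best fixed expert by at most O(sqrt(T ln N)); this is the kind of expert-advice guarantee the paper's reduction relies on, though the specific weak online learner in Algorithm 1 differs in its details.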