Private Distribution Learning with Public Data: The View from Sample Compression
Authors: Shai Ben-David, Alex Bie, Clément L. Canonne, Gautam Kamath, Vikrant Singhal
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We study the problem of private distribution learning with access to public data. We show that the public-private learnability of a class Q is connected to the existence of a sample compression scheme for Q, as well as to an intermediate notion we refer to as list learning. Our work investigates the sample complexity of public-private learning, and does not give computationally efficient learners, or in some cases, algorithmic learners that run in finite time. |
| Researcher Affiliation | Academia | Shai Ben-David, University of Waterloo & Vector Institute, shai@uwaterloo.ca; Alex Bie, University of Waterloo, yabie@uwaterloo.ca; Clément L. Canonne, University of Sydney, clement.canonne@sydney.edu.au; Gautam Kamath, University of Waterloo & Vector Institute, g@csail.mit.edu; Vikrant Singhal, University of Waterloo, vikrant.singhal@uwaterloo.ca |
| Pseudocode | No | The paper does not include any structured pseudocode or algorithm blocks. Algorithmic steps are described in prose. |
| Open Source Code | No | The paper does not contain any explicit statements about releasing open-source code for the described methodology, nor does it provide any links to a code repository. |
| Open Datasets | No | The paper is theoretical and does not conduct empirical experiments that would involve using a publicly available dataset. No specific dataset is mentioned as being used for training or evaluation. |
| Dataset Splits | No | The paper is theoretical and does not describe empirical experiments. Therefore, there are no mentions of training, validation, or test dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not conduct empirical experiments; therefore, no specific hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and does not describe any computational experiments; thus it does not list any specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any empirical experimental setup, including hyperparameters or system-level training settings. |