Differentially Private Domain Adaptation with Theoretical Guarantees
Authors: Raef Bassily, Corinna Cortes, Anqi Mao, Mehryar Mohri
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | While our main objective is a theoretical analysis, we also report the results of several experiments. We first show that the non-private versions of our algorithms match state-of-the-art performance in supervised adaptation and that for larger values of the target sample size or ε, the performance of our private algorithms remains close to that of their non-private counterparts. |
| Researcher Affiliation | Collaboration | Raef Bassily (1), Corinna Cortes (2), Anqi Mao (3), Mehryar Mohri (2,3). 1: The Ohio State University; 2: Google Research, New York, NY; 3: Courant Institute of Mathematical Sciences, New York, NY. Correspondence to: Anqi Mao <aqmao@cims.nyu.edu>. |
| Pseudocode | Yes | Algorithm 1 CnvxAdap: Private adaptation algorithm based on F |
| Open Source Code | No | The paper does not contain any explicit statement about releasing source code or a link to a code repository. |
| Open Datasets | Yes | We consider five regression datasets with dimensions as high as 384 from the UCI machine learning repository (Dua & Graff, 2017): Wind, Airline, Gas, News, and Slice. |
| Dataset Splits | Yes | We carry out model selection on the target validation set and report in Table 1 the mean and standard deviation on the test set over 10 random splits of the target training and validation sets. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models or processor types used for running its experiments. |
| Software Dependencies | No | The paper mentions using logistic regression classifiers but does not specify any software dependencies with version numbers. |
| Experiment Setup | Yes | For details on hyperparameter tuning see Appendix E. |
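The evaluation protocol quoted in the Dataset Splits row (mean and standard deviation on the test set over 10 random splits of the target training and validation sets) can be sketched as follows. This is a minimal illustration, not the authors' code; `run_split` is a hypothetical stand-in for one full train/validate/test run.

```python
import numpy as np


def run_split(seed: int) -> float:
    """Hypothetical stand-in for one experiment run: split the target data
    with the given seed, select a model on the validation set, and return
    the resulting test error. Here it just returns a placeholder value."""
    rng = np.random.default_rng(seed)
    return 0.10 + 0.01 * rng.standard_normal()  # placeholder test error


# Repeat over 10 random splits, as described in the paper's protocol,
# then report mean and standard deviation of the test error.
errors = [run_split(seed) for seed in range(10)]
print(f"test error: {np.mean(errors):.3f} +/- {np.std(errors):.3f}")
```

Averaging over several random splits in this way separates genuine performance differences from split-to-split noise, which is why the paper reports both a mean and a standard deviation.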