Differentially Private Model Personalization

Authors: Prateek Jain, John Rush, Adam Smith, Shuang Song, Abhradeep Guha Thakurta

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We give new algorithms for this setting, analyze their accuracy on specific data distributions, and test our results empirically.
Researcher Affiliation | Collaboration | Prateek Jain (Google Research, prajain@google.com); Keith Rush (Google Research, krush@google.com); Adam Smith (Boston University, ads22@bu.edu); Shuang Song (Google Research, shuangsong@google.com); Abhradeep Thakurta (Google Research, athakurta@google.com)
Pseudocode | Yes | Algorithm 1 (Priv-AltMin): Differentially Private Alternating Minimization Meta-algorithm (a minimal sketch of this alternating structure follows the table)
Open Source Code | No | The paper does not provide a statement or link indicating that the source code for the methodology is openly available.
Open Datasets | No | The paper mentions using 'synthetic data' but does not provide any link, DOI, or formal citation for public access to this data.
Dataset Splits | No | The paper describes generating synthetic data and evaluating population MSE, but it does not specify explicit training, validation, or test dataset splits in terms of percentages or sample counts for overall model evaluation.
Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU/CPU models, memory) used for running the experiments.
Software Dependencies | No | The paper does not provide specific software dependencies with version numbers needed to replicate the experiment.
Experiment Setup | Yes | We set the number of users n = 50,000, number of samples per user m = 10, data dimension d = 50 and rank k = 2. We sample x, U and v from Gaussian distributions, and the noise σ of the target y is set to be 0.01. We normalize U to unit norm. We run Algorithm 1 with full batch, i.e., T = 1, and for multiple epochs. [...] We fix the clipping norm to be 10^4 and pick the optimal number of epochs in {1, 2, 5, 10}. (A minimal sketch of this synthetic setup follows the table.)
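
The Experiment Setup row above fully determines the synthetic data distribution, so it can be reconstructed directly. The NumPy sketch below is a hedged reconstruction, not the authors' code: the random seed, array names, and the choice to normalize U by its Frobenius norm (the row only says "unit norm") are assumptions.

```python
import numpy as np

# Hypothetical reconstruction of the synthetic setup quoted in the
# "Experiment Setup" row; names and seed are illustrative assumptions.
rng = np.random.default_rng(0)

n, m, d, k = 50_000, 10, 50, 2   # users, samples per user, data dimension, rank
sigma = 0.01                     # std. dev. of the noise added to the target y

# Shared low-rank map U (d x k), normalized to unit norm (Frobenius norm assumed),
# and per-user embeddings v_i (k,), all sampled from Gaussian distributions.
U = rng.standard_normal((d, k))
U /= np.linalg.norm(U)
V = rng.standard_normal((n, k))

# Per-user features x (m x d) and targets y = <U v_i, x> + noise.
X = rng.standard_normal((n, m, d))
Y = np.einsum("nmd,dk,nk->nm", X, U, V) + sigma * rng.standard_normal((n, m))
```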
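Algorithm 1 (Priv-AltMin) is presented in the paper as a meta-algorithm, so the following is only a sketch of one plausible instantiation of its alternating structure: each user fits their embedding locally by least squares, and the shared map U is updated from a clipped, Gaussian-noised average of per-user gradients. The function names, ridge term, learning rate, and noise multiplier are illustrative placeholders, and no formal privacy accounting is included; only the clipping norm (10^4) and epoch grid come from the quoted setup.

```python
import numpy as np

def user_embedding(U, X_i, y_i, ridge=1e-6):
    """Local, non-private step: least-squares fit of user i's embedding given U."""
    A = X_i @ U                                              # (m, k)
    return np.linalg.solve(A.T @ A + ridge * np.eye(U.shape[1]), A.T @ y_i)

def priv_alt_min(X, Y, d, k, epochs=5, lr=1.0, clip=1e4, noise_mult=1.0, seed=0):
    """Sketch of a DP alternating-minimization loop (full batch, T = 1 per epoch):
    users refit their embeddings locally; the shared U is updated from a clipped,
    Gaussian-noised sum of per-user gradients, averaged over the n users."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.standard_normal((d, k))
    U /= np.linalg.norm(U)
    for _ in range(epochs):
        grad_sum = np.zeros((d, k))
        for i in range(n):  # plain Python loop for clarity, not speed
            v_i = user_embedding(U, X[i], Y[i])              # uses only user i's data
            resid = X[i] @ U @ v_i - Y[i]                    # (m,)
            g_i = X[i].T @ np.outer(resid, v_i) / len(Y[i])  # (d, k) gradient w.r.t. U
            g_i *= min(1.0, clip / (np.linalg.norm(g_i) + 1e-12))  # per-user clipping
            grad_sum += g_i
        noise = noise_mult * clip * rng.standard_normal((d, k))
        U -= lr * (grad_sum + noise) / n                     # noised average update
    return U
```

With the arrays from the previous sketch, a run over one of the quoted epoch settings would look like `U_hat = priv_alt_min(X, Y, d=50, k=2, epochs=5)`.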