Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC
Authors: Arun Ganesh, Kunal Talwar
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this work, we establish rapid convergence for these algorithms under distance measures more suitable for differential privacy. For smooth, strongly-convex f, we give the first results proving convergence in Rényi divergence. This gives us fast differentially private algorithms for such f. Our techniques are simple and generic and apply also to underdamped Langevin dynamics. (A sketch of the discretized Langevin update appears after this table.) |
| Researcher Affiliation | Collaboration | Arun Ganesh, Department of EECS, UC Berkeley, Berkeley, CA 94709, arunganesh@berkeley.edu; Kunal Talwar, Apple, Cupertino, CA 95014, ktalwar@apple.com |
| Pseudocode | No | The paper describes algorithmic steps using mathematical equations and textual descriptions, but no formal pseudocode or algorithm blocks are present. |
| Open Source Code | No | The paper does not provide any statement about releasing source code or a link to a code repository. |
| Open Datasets | No | The paper is theoretical and conducts no experiments on datasets, so no training dataset information is provided. |
| Dataset Splits | No | The paper is theoretical and conducts no experiments on datasets, so no dataset split information for validation is provided. |
| Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not list any specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details or hyperparameters. |
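
Since the paper ships no code or pseudocode, the following is a minimal sketch, for orientation only, of the discretized (unadjusted) Langevin dynamics update x_{k+1} = x_k - η ∇f(x_k) + √(2η) · N(0, I) that the paper's Rényi divergence analysis concerns. It is written in Python/NumPy under our own assumptions; the function name `langevin_step`, the step size, and the quadratic example f are illustrative and do not come from the paper.

```python
import numpy as np

def langevin_step(x, grad_f, eta, rng):
    """One step of unadjusted Langevin MCMC:
    x_{k+1} = x_k - eta * grad_f(x_k) + sqrt(2 * eta) * N(0, I).
    (Illustrative helper; not from the paper.)"""
    noise = rng.standard_normal(x.shape)
    return x - eta * grad_f(x) + np.sqrt(2.0 * eta) * noise

# Example: approximately sample from exp(-f) for the smooth,
# strongly convex f(x) = ||x||^2 / 2, whose gradient is x.
rng = np.random.default_rng(0)
x = np.zeros(5)
for _ in range(1000):
    x = langevin_step(x, lambda z: z, eta=0.01, rng=rng)
```

The paper's contribution is the convergence analysis of iterations like this in Rényi divergence (the divergence underlying Rényi differential privacy), not the update rule itself, which is standard.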