Differentially Private Clustering: Tight Approximation Ratios
Authors: Badih Ghazi, Ravi Kumar, Pasin Manurangsi
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We study the task of differentially private clustering. For several basic clustering problems, including Euclidean Densest Ball, 1-Cluster, k-means, and k-median, we give efficient differentially private algorithms that achieve essentially the same approximation ratios as those that can be obtained by any non-private algorithm, while incurring only small additive errors. |
| Researcher Affiliation | Industry | Badih Ghazi Google Research Mountain View, CA badihghazi@gmail.com; Ravi Kumar Google Research Mountain View, CA ravi.k53@gmail.com; Pasin Manurangsi Google Research Mountain View, CA pasin@google.com |
| Pseudocode | Yes | Algorithm 1: DENSESTBALL (x_1, …, x_n; r, ε) |
| Open Source Code | No | The paper does not contain any statements about releasing open-source code for the described methodology or provide a link to a code repository. |
| Open Datasets | No | The paper defines abstract input data ('a set X of n points, each contained in the d-dimensional unit ball') for its theoretical algorithms but does not mention the use of any specific publicly available or open datasets for empirical training or evaluation. |
| Dataset Splits | No | The paper is theoretical and does not report empirical experiments; therefore, no training, validation, or test splits are mentioned. |
| Hardware Specification | No | The paper describes theoretical algorithms and their properties; since no empirical experiments are reported, no hardware specification is given. |
| Software Dependencies | No | The paper is theoretical and focuses on algorithm design and analysis, and thus does not specify any software dependencies with version numbers for reproducing empirical experiments. |
| Experiment Setup | No | The paper is theoretical and presents algorithm design and analysis; therefore, it does not include details about an experimental setup, hyperparameters, or training configurations. |
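Since the paper releases no code, the flavor of its setting can be sketched generically. The following is a minimal, hypothetical illustration of a differentially private centroid update via the Laplace mechanism; it is NOT the paper's DENSESTBALL algorithm (which achieves much stronger approximation guarantees), and the budget split and sensitivity bounds are our own assumptions for points in the d-dimensional unit ball.

```python
import numpy as np

def dp_centroid_update(points, assignments, k, epsilon, rng=None):
    """One Laplace-mechanism Lloyd's-style centroid update (illustrative only;
    not the algorithm from Ghazi, Kumar, Manurangsi 2020).

    points: (n, d) array, each row assumed to lie in the d-dim unit ball.
    assignments: length-n array of cluster indices in {0, ..., k-1}.
    epsilon: total privacy budget, split evenly between counts and sums.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = points.shape
    eps_count, eps_sum = epsilon / 2.0, epsilon / 2.0
    centroids = np.zeros((k, d))
    for j in range(k):
        members = points[assignments == j]
        # Under add/remove neighboring datasets, the cluster count has
        # L1 sensitivity 1, and the coordinate-wise sum has L1 sensitivity
        # at most sqrt(d) (L1 norm of a unit-ball point is <= sqrt(d)).
        noisy_count = len(members) + rng.laplace(scale=1.0 / eps_count)
        noisy_sum = members.sum(axis=0) + rng.laplace(
            scale=np.sqrt(d) / eps_sum, size=d
        )
        # Clamp the denominator to avoid division by a tiny/negative count.
        centroids[j] = noisy_sum / max(noisy_count, 1.0)
    return centroids
```

With a large epsilon the noisy centroids approach the exact cluster means, reflecting the paper's theme that privacy can cost only a small additive error.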