Facility Location Problem in Differential Privacy Model Revisited
Authors: Yunus Esencayi, Marco Gaboardi, Shi Li, Di Wang
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper we study the uncapacitated facility location problem in the model of differential privacy (DP) with uniform facility cost. Specifically, we first show that, under the hierarchically well-separated tree (HST) metrics and the super-set output setting that was introduced in [8], there is an ϵ-DP algorithm that achieves an O(1/ϵ) (expected multiplicative) approximation ratio; this implies an O((log n)/ϵ) approximation ratio for the general metric case, where n is the size of the input metric. These bounds improve the best-known results given by [8]. |
| Researcher Affiliation | Academia | Yunus Esencayi (SUNY at Buffalo, yunusese@buffalo.edu); Marco Gaboardi (Boston University, gaboardi@bu.edu); Shi Li (SUNY at Buffalo, shil@buffalo.edu); Di Wang (SUNY at Buffalo, dwang45@buffalo.edu) |
| Pseudocode | Yes | The paper includes pseudocode: Algorithm 1, UFL-tree-base(ϵ), and Algorithm 2, DP-UFL-tree(ϵ). |
| Open Source Code | No | The paper does not provide concrete access information for source code (e.g., a repository link or an explicit statement of code release). |
| Open Datasets | No | This is a theoretical paper focused on algorithms and proofs rather than empirical evaluation, so it does not use or reference any datasets or their availability. |
| Dataset Splits | No | This is a theoretical paper focused on algorithms and proofs rather than empirical evaluation, so it does not describe any train/validation/test splits. |
| Hardware Specification | No | This is a theoretical paper and does not describe experiments that would require hardware specifications. |
| Software Dependencies | No | This is a theoretical paper and does not mention specific software dependencies with version numbers. |
| Experiment Setup | No | This is a theoretical paper and does not describe experimental setups with hyperparameters or training configurations. |
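
For context on the headline result quoted in the Research Type row: the step from the HST bound to the general-metric bound follows the standard tree-embedding argument. The sketch below is not taken verbatim from the paper; it assumes the classical embedding of an arbitrary n-point metric into HSTs with O(log n) expected distortion and non-contracting distances, and writes cost* for the optimal facility location cost in the indicated metric.

```latex
% Sketch: composing the O(1/eps) guarantee on HST metrics with a standard
% O(log n)-expected-distortion HST embedding to obtain O((log n)/eps) on
% general metrics. The first inequality uses that the embedding is
% non-contracting, so the solution's cost in the original metric is at
% most its cost in the HST metric.
\[
  \mathbb{E}\bigl[\mathrm{cost}_{\mathrm{alg}}^{\mathrm{general}}\bigr]
  \;\le\; O\!\left(\tfrac{1}{\epsilon}\right)\cdot
          \mathbb{E}\bigl[\mathrm{cost}^{*}_{\mathrm{HST}}\bigr]
  \;\le\; O\!\left(\tfrac{1}{\epsilon}\right)\cdot O(\log n)\cdot
          \mathrm{cost}^{*}_{\mathrm{general}}
  \;=\; O\!\left(\tfrac{\log n}{\epsilon}\right)\cdot
          \mathrm{cost}^{*}_{\mathrm{general}}.
\]
```

This is the usual composition for transferring tree-metric guarantees to general metrics; the paper's formal statement and constants are in the source PDF.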