The Price of Privacy for Low-rank Factorization
Authors: Jalaj Upadhyay
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we give a glimpse of our experimental evaluations of additive error and compare it with the best known results. The details and discussion of our empirical evaluations are in the supplementary materials. Two important parameters in our bounds are k and α. Hardt and Roth [30] consider a constant α. Therefore, we analyze the additive error with respect to the change in α in order to better understand the effect of differential privacy on low-space low-rank approximation of matrices. The result of our experiment is presented in Figure 1 (a)-(d), with the y-axis (accuracy) on a logarithmic scale to better illustrate the accuracy improvement shown by our algorithm. In both these experiments, we see that the additive error incurred by our algorithm is less than the additive error incurred by Hardt and Roth [30]. We note that the matrices are highly incoherent, as all the entries are sampled i.i.d. We also consider the role of k in our locally-private algorithm. The results of our experiment are presented in Figure 1 (e)-(f). The error of our algorithm is consistently less than the expected error. (An illustrative sketch of such an additive-error measurement appears after this table.) |
| Researcher Affiliation | Academia | Jalaj Upadhyay Johns Hopkins University Baltimore, MD 21201, USA. jalaj@jhu.edu |
| Pseudocode | Yes | Algorithm 1 PRIVATE-OPTIMAL-LRF(A; (ϵ, δ); α; k) (see the illustrative Python sketch after this table) |
| Open Source Code | No | The paper does not include an explicit statement about releasing source code or a link to a code repository for the methodology described. |
| Open Datasets | No | In this section, we give a glimpse of our experimental evaluations of additive error and compare it with the best known results... We note that the matrices are highly incoherent as all the entries are sampled i.i.d. |
| Dataset Splits | No | The paper does not explicitly provide details about dataset splits (training, validation, test). |
| Hardware Specification | No | The paper does not explicitly describe the hardware used to run its experiments with specific models or specifications. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | No | Two important parameters in our bounds are k and α. Hardt and Roth [30] consider a constant α. Therefore, we analyze the additive error with respect to the change in α in order to better understand the effect of differential privacy on low-space low-rank approximation of matrices. |
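For orientation, below is a minimal Python sketch of a simple (ϵ, δ)-differentially private low-rank factorization baseline: input perturbation via the Gaussian mechanism followed by a truncated SVD. This is *not* the paper's PRIVATE-OPTIMAL-LRF algorithm; the function names, the unit-L2-sensitivity neighboring-matrix model, and the noise calibration (which assumes ϵ ≤ 1) are our illustrative assumptions.

```python
import numpy as np

def private_low_rank(A, k, eps, delta, rng=None):
    """Rank-k factorization of A under (eps, delta)-differential privacy
    by input perturbation: add Gaussian noise, then take a truncated SVD.
    Assumes neighboring matrices differ in one entry by at most 1
    (L2 sensitivity 1) and eps <= 1 for this noise calibration."""
    rng = np.random.default_rng(rng)
    # Standard Gaussian-mechanism noise scale for sensitivity 1.
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    noisy = A + rng.normal(0.0, sigma, size=A.shape)
    U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
    # Return factors L, R whose product is the rank-k approximation.
    return U[:, :k] * s[:k], Vt[:k, :]

def additive_error(A, L, R, k):
    """||A - L R||_F minus the optimal rank-k error ||A - A_k||_F."""
    s = np.linalg.svd(A, compute_uv=False)
    opt = np.sqrt(np.sum(s[k:] ** 2))
    return np.linalg.norm(A - L @ R, "fro") - opt
```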
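And a usage sketch in the spirit of the paper's experiments, using the `private_low_rank` and `additive_error` functions above. The paper sweeps both α and k and compares against Hardt and Roth [30]; this hypothetical snippet only varies k on an i.i.d. Gaussian (hence highly incoherent) matrix, and the matrix size and parameter values are our choices, not the paper's.

```python
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 500))  # i.i.d. entries => highly incoherent

for k in (5, 10, 20, 40):
    L, R = private_low_rank(A, k, eps=1.0, delta=1e-6, rng=1)
    print(f"k={k:2d}  additive error ~ {additive_error(A, L, R, k):.1f}")
```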