Locally Private Gaussian Estimation
Authors: Matthew Joseph, Janardhan Kulkarni, Jieming Mao, Zhiwei Steven Wu
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We match these upper bounds with an information-theoretic lower bound showing that our accuracy guarantees are tight up to logarithmic factors for all sequentially interactive locally private protocols. |
| Researcher Affiliation | Collaboration | Matthew Joseph (University of Pennsylvania, majos@cis.upenn.edu); Janardhan Kulkarni (Microsoft Research Redmond, jakul@microsoft.com); Jieming Mao (Google Research New York, maojm@google.com); Zhiwei Steven Wu (University of Minnesota, zsw@umn.edu) |
| Pseudocode | Yes | Algorithm 1 KVGAUSSTIMATE, Algorithm 2 KVAGG1, Algorithm 3 ESTMEAN, Algorithm 4 KVRR2, Algorithm 5 KVAGG2, Algorithm 6 UVGAUSSTIMATE |
| Open Source Code | No | The paper does not provide any statements or links regarding the public availability of source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and focuses on an estimation problem over i.i.d. samples from a Gaussian distribution, without referring to any specific publicly available dataset or providing access information for one. |
| Dataset Splits | No | The paper is theoretical and does not involve empirical experiments requiring dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not specify any software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and focuses on algorithm design and proofs rather than empirical experimental setup details like hyperparameters or training configurations. |
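Although the paper provides no runnable code, the local model it studies can be illustrated with a minimal sketch of non-interactive locally private mean estimation via the Laplace mechanism. This is NOT the paper's KVGAUSSTIMATE protocol (which is interactive and adaptively recenters its randomizers); the function names, clipping range, and parameters below are hypothetical choices for illustration only.

```python
# Minimal sketch: each user perturbs their own sample before release
# (local differential privacy), and the analyst averages the releases.
# This is an illustrative baseline, not the paper's algorithm.
import random

def local_laplace_release(x, epsilon, clip=1.0):
    """A single user clips their sample to [-clip, clip] and adds Laplace
    noise calibrated to the clipped range, giving epsilon-local DP."""
    x = max(-clip, min(clip, x))
    scale = 2.0 * clip / epsilon  # sensitivity of the clipped value is 2*clip
    # Laplace(scale) sampled as the difference of two exponentials.
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return x + noise

def estimate_mean(samples, epsilon, clip=1.0):
    """Analyst-side aggregation: the added noise is mean-zero, so the
    average of the noisy releases is an unbiased estimate of the
    (clipped) population mean."""
    releases = [local_laplace_release(x, epsilon, clip) for x in samples]
    return sum(releases) / len(releases)

if __name__ == "__main__":
    random.seed(0)
    # Hypothetical Gaussian population with mean 0.3 and small variance,
    # so clipping at 1.0 is essentially inactive.
    data = [random.gauss(0.3, 0.1) for _ in range(20000)]
    print(estimate_mean(data, epsilon=1.0))
```

The per-user noise has standard deviation on the order of `2*clip/epsilon`, so the analyst's error shrinks roughly as `1/sqrt(n)`; the paper's contribution is an interactive protocol whose accuracy avoids the dependence on a crude a-priori clipping range.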