KNG: The K-Norm Gradient Mechanism
Authors: Matthew Reimherr, Jordan Awan
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In addition to theoretical guarantees on privacy and utility, we confirm the utility of KNG empirically in the settings of linear and quantile regression through simulations. |
| Researcher Affiliation | Academia | Matthew Reimherr, Department of Statistics, Pennsylvania State University, State College, PA 16802 (mreimherr@psu.edu); Jordan Awan, Department of Statistics, Pennsylvania State University, State College, PA 16802 (awan@psu.edu) |
| Pseudocode | Yes | Algorithm 1 Regression Simulation |
| Open Source Code | No | The paper discusses the implementation of sampling procedures (e.g., MCMC) but does not provide any link or explicit statement about releasing the source code for the methodology described. |
| Open Datasets | No | The paper describes generating synthetic data for simulations using specific distributions (e.g., X_ij i.i.d. U(-1, 1), errors e_i ~ N(0, 1)) rather than using a publicly available or open dataset. |
| Dataset Splits | No | The paper conducts simulations by generating data for each replicate but does not describe the use of explicit training, validation, and test dataset splits from a pre-existing dataset. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., CPU, GPU models, memory) used to run the experiments. |
| Software Dependencies | No | The paper mentions using a 'one-at-a-time MCMC procedure' but does not specify any software names with version numbers (e.g., Python, PyTorch, TensorFlow, or specific statistical packages) used for the implementation. |
| Experiment Setup | Yes | For each n in 10^2, 10^3, 10^4, ..., 10^7 we run 100 replicates of Algorithm 1 at ϵ = 1. For KNG and the exponential mechanism, we draw samples using a one-at-a-time MCMC procedure with 10000 steps. (A hedged simulation sketch of this setup follows the table.) |
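
To make the quoted setup concrete, below is a minimal Python sketch of the simulation loop: covariates drawn i.i.d. U(-1, 1), Gaussian errors, 100 replicates per sample size at ϵ = 1, and a coordinate-wise ("one-at-a-time") Metropolis sampler run for 10000 steps. The specific KNG target density, the sensitivity constant `DELTA`, the true coefficients `BETA_TRUE`, the proposal step size, and the reported error metric are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical sketch of the simulation described in the table:
# X_ij ~ iid U(-1, 1), e_i ~ N(0, 1), 100 replicates per n, epsilon = 1,
# one-at-a-time MCMC with 10000 steps. The target density and constants
# below are placeholders, not taken from the paper.

rng = np.random.default_rng(0)

EPSILON = 1.0                       # privacy budget used in the simulations
MCMC_STEPS = 10_000                 # "one-at-a-time MCMC procedure with 10000 steps"
N_REPLICATES = 100                  # replicates per sample size
BETA_TRUE = np.array([1.0, -1.0])   # illustrative true coefficients (assumption)
DELTA = 1.0                         # placeholder sensitivity constant (assumption)


def generate_data(n, rng):
    """Synthetic linear-regression data as described in the quoted setup."""
    X = rng.uniform(-1.0, 1.0, size=(n, BETA_TRUE.size))
    y = X @ BETA_TRUE + rng.normal(0.0, 1.0, size=n)
    return X, y


def kng_log_density(theta, X, y):
    """Unnormalized log-density of a KNG-style target: proportional to
    -epsilon/(2*DELTA) times a norm of the loss gradient at theta.
    This is a sketch; the paper's exact objective and norm may differ."""
    grad = X.T @ (X @ theta - y) / len(y)
    return -EPSILON / (2.0 * DELTA) * np.linalg.norm(grad, ord=1)


def one_at_a_time_mcmc(X, y, rng, steps=MCMC_STEPS, step_size=0.1):
    """Coordinate-wise Metropolis sampler: one coordinate updated per step."""
    theta = np.zeros(BETA_TRUE.size)
    log_p = kng_log_density(theta, X, y)
    for t in range(steps):
        j = t % theta.size                      # update one coordinate at a time
        proposal = theta.copy()
        proposal[j] += rng.normal(0.0, step_size)
        log_p_new = kng_log_density(proposal, X, y)
        if np.log(rng.uniform()) < log_p_new - log_p:
            theta, log_p = proposal, log_p_new
    return theta


if __name__ == "__main__":
    for n in [10**2, 10**3, 10**4]:             # the paper goes up to 10**7
        errors = []
        for _ in range(N_REPLICATES):
            X, y = generate_data(n, rng)
            theta_hat = one_at_a_time_mcmc(X, y, rng)
            errors.append(np.linalg.norm(theta_hat - BETA_TRUE))
        print(f"n={n:>7}  mean error over {N_REPLICATES} replicates: "
              f"{np.mean(errors):.3f}")
```

The exponential-mechanism baseline and the quantile-regression setting mentioned in the paper would presumably reuse the same outer loop with a different target log-density; only the sampler's target changes.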