DP-Fast MH: Private, Fast, and Accurate Metropolis-Hastings for Large-Scale Bayesian Inference
Authors: Wanrong Zhang, Ruqi Zhang
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We empirically demonstrate the effectiveness and efficiency of our algorithm in various experiments. |
| Researcher Affiliation | Academia | 1Harvard John A. Paulson School of Engineering and Applied Sciences 2Department of Computer Science, Purdue University |
| Pseudocode | Yes | Algorithm 1: TunaMH |
| Open Source Code | Yes | We released the code at https://github.com/ruqizhang/dpfastmh. |
| Open Datasets | Yes | Truncated Gaussian mixture. We first test DP-fast MH on a two-dimensional truncated Gaussian mixture. Following previous work (Welling & Teh, 2011; Heikkilä et al., 2019; Zhang et al., 2020), the data is generated as follows... |
| Dataset Splits | No | MNIST restricted to images of 7s and 9s contains 12,214 training samples and 2,037 test samples. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | Yes | In all experiments, we set the privacy parameter δ = 10⁻⁵. In Section 6.1, we compare DP-fast MH with existing methods under two settings: truncated Gaussian mixture and logistic regression on the MNIST dataset. In Section 6.2, we discuss the effect of the newly introduced hyperparameter K. For other hyperparameters in the algorithm which are already present in the baseline method TunaMH, we set the values following Zhang et al. (2020). We tune the stepsize of each method to reach an acceptance rate of 60% and set K to be around ϵC / max_i c_i. |
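The reported rule of thumb for the hyperparameter K, i.e. K ≈ ϵC / max_i c_i, can be sketched as a small helper. This is an illustrative reconstruction, not the authors' released code; the function name `suggested_K` and the example values for ϵ, C, and the per-datum bounds c_i are made up here.

```python
import numpy as np

def suggested_K(epsilon, C, c):
    """Heuristic from the paper's experiment setup: set K to be
    around epsilon * C / max_i c_i.

    epsilon : privacy budget (illustrative value below)
    C       : the bound constant C from TunaMH (illustrative)
    c       : per-datum bounds c_i (illustrative)
    """
    c = np.asarray(c, dtype=float)
    return epsilon * C / c.max()

# Hypothetical values, only to show the arithmetic:
K = suggested_K(1.0, 50.0, [0.5, 0.25, 0.1])
print(K)  # 100.0
```

In practice K would then be rounded to a nearby integer; the paper's Section 6.2 studies how sensitive the results are to this choice.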