Differentially Private Distributed Bayesian Linear Regression with MCMC

Authors: Barış Alparslan, Sinan Yıldırım, İlker Birbil

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We provide numerical results on both real and simulated data, which demonstrate that the proposed algorithms provide well-rounded estimation and prediction. We present several numerical evaluations of the proposed methods, MCMC-normalX, MCMC-fixedS, and Bayes-fixedS-fast, with simulated and real data.
Researcher Affiliation | Academia | (1) Faculty of Engineering and Natural Sciences, Sabancı University, Turkey; (2) Amsterdam Business School, University of Amsterdam, The Netherlands.
Pseudocode | Yes | Algorithm 1: MCMC-normalX (one iteration); Algorithm 2: MCMC-fixedS (one iteration); Algorithm 3: Bayes-fixedS-fast.
Open Source Code | Yes | Link to the code and the data for the experiments: https://github.com/sinanyildirim/Bayesian_DP_dist_LR.git
Open Datasets | Yes | For the real data case, we use four different data sets from the UCI Machine Learning Repository: power plant energy (n = 7655, d = 4), bike sharing (n = 13904, d = 14), air quality (n = 7486, d = 12), and 3droad (n = 347900, d = 3).
Dataset Splits | No | For prediction, we took 80% of the data for training and the rest for testing. (An illustrative split is sketched below the table.)
Hardware Specification | Yes | The algorithms were run in MATLAB 2021b on an Apple M1 chip with 8 cores and 16 GB LPDDR4 memory.
Software Dependencies | Yes | The algorithms were run in MATLAB 2021b on an Apple M1 chip with 8 cores and 16 GB LPDDR4 memory.
Experiment Setup | Yes | For inference, we used the same Λ, κ as above and a = 20, b = 0.5, m = 0_{d×1}, C = b/(a−1) I_d. All the MCMC algorithms were run for 10^4 iterations. For each (J, ε) pair, we ran each method 50 times (each with different noisy observations) to obtain average performances. (A configuration sketch follows below the table.)
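
Illustrative sketch for the Dataset Splits row: a minimal Python/NumPy version of an 80/20 train/test split. The paper only states the 80/20 ratio; the random permutation, the fixed seed, and the toy data standing in for a UCI data set are assumptions of this sketch, not the authors' MATLAB code.

    import numpy as np

    def split_80_20(X, y, seed=0):
        # Randomly send 80% of the rows to training and the rest to testing.
        # The 80/20 ratio follows the paper; the random permutation and the
        # fixed seed are assumptions made for this illustration.
        rng = np.random.default_rng(seed)
        perm = rng.permutation(X.shape[0])
        n_train = int(0.8 * X.shape[0])
        train_idx, test_idx = perm[:n_train], perm[n_train:]
        return X[train_idx], y[train_idx], X[test_idx], y[test_idx]

    # Toy regression data standing in for one of the UCI data sets.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((10000, 4))
    y = X @ rng.standard_normal(4) + 0.1 * rng.standard_normal(10000)
    X_train, y_train, X_test, y_test = split_80_20(X, y)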
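
Illustrative sketch for the Experiment Setup row: a hedged Python outline of the reported configuration, i.e. prior hyperparameters a = 20, b = 0.5, m = 0_{d×1}, C = b/(a−1) I_d, 10^4 MCMC iterations, and 50 repetitions per (J, ε) pair. The feature dimension d, the grids of J and ε values, and run_one_rep are placeholders; the actual samplers (MCMC-normalX, MCMC-fixedS, Bayes-fixedS-fast) and the values of Λ and κ are in the authors' MATLAB repository and are not reproduced here.

    import itertools
    import numpy as np

    d = 4                              # feature dimension (placeholder value)
    a, b = 20.0, 0.5                   # prior hyperparameters from the paper
    m = np.zeros(d)                    # prior mean vector, m = 0_{d x 1}
    C = (b / (a - 1.0)) * np.eye(d)    # prior covariance, C = b/(a-1) * I_d

    n_iter = 10_000                    # MCMC iterations per run (10^4)
    n_reps = 50                        # repetitions per (J, epsilon) pair

    J_values = (2, 5, 10)              # placeholder grid of data-holder counts J
    eps_values = (0.1, 1.0, 10.0)      # placeholder grid of privacy budgets

    def run_one_rep(J, eps, n_iter, seed):
        # Stand-in for a single run of one of the paper's methods on freshly
        # generated DP-noisy observations; here it only returns a dummy score.
        rng = np.random.default_rng(seed)
        return rng.standard_normal()

    avg_score = {}
    for J, eps in itertools.product(J_values, eps_values):
        scores = [run_one_rep(J, eps, n_iter, seed=rep) for rep in range(n_reps)]
        avg_score[(J, eps)] = float(np.mean(scores))   # average over the 50 runs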