Improved Analysis of Sparse Linear Regression in Local Differential Privacy Model

Authors: Liyang Zhu, Meng Ding, Vaneet Aggarwal, Jinhui Xu, Di Wang

ICLR 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility assessment (variable, result, LLM response):
Research Type: Theoretical. In this paper, we revisit the problem of sparse linear regression in the local differential privacy (LDP) model. ... We propose an innovative NLDP algorithm, the very first of its kind for the problem. ... Our algorithm achieves an upper bound of O(√(dk)/(√n ε)) for the estimation error when the data is sub-Gaussian... For the sequentially interactive LDP model, we show a similar lower bound of Ω(√(dk)/(√n ε)). As for the upper bound, we rectify a previous method and show that it is possible to achieve a bound of O(k√d/(√n ε)).
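For reference, the three bounds quoted in the abstract can be written out explicitly. The original PDF extraction garbled the radicals, so their exact placement below is a reconstruction, not a verbatim quote (n is the sample size, d the dimension, k the sparsity, ε the privacy parameter; logarithmic factors are suppressed):

```latex
% Reconstruction of the abstract's bounds (radical placement is an assumption;
% n = sample size, d = dimension, k = sparsity, \epsilon = privacy parameter).
\[
\text{NLDP upper bound: } \tilde{O}\!\left(\frac{\sqrt{dk}}{\sqrt{n}\,\epsilon}\right),
\qquad
\text{sequential LDP lower bound: } \Omega\!\left(\frac{\sqrt{dk}}{\sqrt{n}\,\epsilon}\right),
\qquad
\text{sequential LDP upper bound: } \tilde{O}\!\left(\frac{k\sqrt{d}}{\sqrt{n}\,\epsilon}\right).
\]
```

Note the remaining √k gap between the sequential lower and upper bounds, which is why the abstract describes the upper bound as a rectification of a previous method rather than a matching bound.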
Researcher Affiliation: Academia. Liyang Zhu (1), Meng Ding (2), Vaneet Aggarwal (3), Jinhui Xu (2), Di Wang (1). (1) PRADA Lab, King Abdullah University of Science and Technology; (2) State University of New York at Buffalo; (3) Purdue University.
Pseudocode: Yes. Algorithm 1: Non-interactive LDP algorithm for Sparse Linear Regression (page 5); Algorithm 2: LDP Iterative Hard Thresholding (page 7); Algorithm 3: Non-interactive LDP algorithm for Sparse Linear Regression with public but unlabeled data (page 9); Algorithm 4: LDP Iterative Hard Thresholding (page 24).
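The name of Algorithms 2 and 4, LDP Iterative Hard Thresholding, points at the standard IHT template: a gradient step followed by projection onto k-sparse vectors. The sketch below is a generic, non-private version of that template, not the paper's algorithm; the function names, step size, and the optional `noise_scale` hook (a crude stand-in for the per-user randomization an LDP variant would add) are all illustrative.

```python
import numpy as np

def hard_threshold(v, k):
    """Keep the k largest-magnitude entries of v; zero out the rest."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]
    out[keep] = v[keep]
    return out

def iht(X, y, k, eta=0.05, iters=300, noise_scale=0.0, rng=None):
    """Generic iterative hard thresholding for k-sparse least squares.

    With noise_scale > 0, Gaussian noise perturbs each gradient as a
    rough stand-in for LDP randomization (the paper's actual mechanism
    differs and calibrates noise to the privacy parameter).
    """
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(iters):
        grad = X.T @ (X @ theta - y) / n   # least-squares gradient
        if noise_scale > 0:
            grad = grad + noise_scale * rng.standard_normal(d)
        theta = hard_threshold(theta - eta * grad, k)  # project to k-sparse
    return theta
```

On a well-conditioned noiseless instance with `noise_scale=0`, this recovers the true k-sparse parameter; in an LDP variant, the injected noise is what drives the error terms quoted in the abstract.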
Open Source Code: No. The paper does not include any statements about releasing source code, providing a repository link, or making the code available in supplementary materials.
Open Datasets: No. The paper is theoretical, focusing on mathematical proofs and algorithm design under distributional assumptions (e.g., sub-Gaussian data, heavy-tailed responses) rather than on experiments with specific, publicly available datasets.
Dataset Splits: No. The paper is theoretical and reports no empirical experiments, so it specifies no train/validation/test splits.
Hardware Specification: No. The paper is theoretical and reports no empirical experiments, so no hardware specifications are given.
Software Dependencies: No. The paper is theoretical and reports no empirical experiments, so no software dependencies or version numbers are specified.
Experiment Setup: No. The paper is theoretical and reports no empirical experiments, so no experimental setup details such as hyperparameters or training configurations are provided.