Private Non-smooth ERM and SCO in Subquadratic Steps

Authors: Janardhan Kulkarni, Yin Tat Lee, Daogao Liu

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | "We study the differentially private Empirical Risk Minimization (ERM) and Stochastic Convex Optimization (SCO) problems for non-smooth convex functions. We get a (nearly) optimal bound on the excess empirical risk for ERM with O(N^{3/2}/d^{1/8}) gradient queries... Combining this result with the iterative localization technique of Feldman et al. [FKT20], we achieve the optimal excess population loss for the SCO problem with O(min{N^{5/4} d^{1/8}, N^{3/2}/d^{1/8}}) gradient queries. Our work makes progress towards resolving a question raised by Bassily et al. [BFGT20], giving first algorithms for private SCO with subquadratic steps."
Researcher Affiliation | Collaboration | Janardhan Kulkarni: Algorithms Group, Microsoft Research (MSR) Redmond; jakul@microsoft.com. Yin Tat Lee: University of Washington and Microsoft Research; supported by NSF awards CCF-1749609, DMS-1839116, DMS-2023166, a Microsoft Research Faculty Fellowship, a Sloan Research Fellowship, and a Packard Fellowship; yintat@uw.edu. Daogao Liu: University of Washington; part of the work was done while visiting the Shanghai Qi Zhi Institute; dgliu@uw.edu.
Pseudocode | Yes | "Algorithm 1: Private Meta Algorithm METADP", "Algorithm 2: Accelerated stochastic approximation (AC-SA) algorithm", and "Algorithm 3: Private AC-SA"
Open Source Code | No | The paper does not provide any concrete access information (e.g., a repository link or an explicit code-release statement) for the source code of the described methodology.
Open Datasets | No | The paper discusses abstract sample sets drawn from an unknown distribution P but does not refer to any specific publicly available datasets.
Dataset Splits | No | The paper does not describe empirical experiments with data and therefore provides no information about training/validation/test dataset splits.
Hardware Specification | No | The paper does not describe empirical experiments that would require hardware specifications.
Software Dependencies | No | The paper does not describe empirical experiments that would require specific software dependencies with version numbers.
Experiment Setup | No | The paper describes no empirical experiments and thus provides no details about an experimental setup, hyperparameters, or system-level training settings.
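The abstract quoted above credits the ERM bound to subsampling and smoothing the non-smooth loss via convolution (i.e., optimizing f_beta(x) = E_{u ~ Unif(ball)} f(x + beta*u)) with noise added for privacy. The following is a minimal illustrative sketch of that idea only, not the paper's Algorithm 1-3; every function name and parameter here is invented for illustration:

```python
import numpy as np

def smoothed_noisy_gradient(grad_f, x, beta, sigma, m, rng):
    """Estimate the gradient of the convolution-smoothed surrogate
    f_beta(x) = E_{u ~ Unif(unit ball)} f(x + beta * u)
    by averaging m subgradients at random perturbations, then add
    Gaussian noise (a stand-in for the Gaussian mechanism).
    Illustrative sketch; not the paper's private AC-SA."""
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(m):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)            # uniform random direction
        r = rng.random() ** (1.0 / d)     # radius giving uniform ball sample
        g += grad_f(x + beta * r * u)
    g /= m
    return g + sigma * rng.standard_normal(d)  # privacy noise (toy scale)

# Toy usage: f(x) = |x| is non-smooth at 0; noisy subgradient descent
# on its smoothed surrogate still drives x toward the minimizer 0.
rng = np.random.default_rng(0)
grad_abs = lambda x: np.sign(x)           # a subgradient of |x|
x = np.array([1.0])
for _ in range(200):
    x -= 0.05 * smoothed_noisy_gradient(grad_abs, x, beta=0.1,
                                        sigma=0.01, m=4, rng=rng)
```

Calibrating sigma to give a formal (eps, delta) guarantee, and the acceleration/localization machinery that yields the subquadratic query counts, are exactly what the paper's algorithms add on top of this basic step.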