Optimal Query Complexity of Secure Stochastic Convex Optimization

Authors: Wei Tang, Chien-Ju Ho, Yang Liu

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We study the secure stochastic convex optimization problem. A learner aims to learn the optimal point of a convex function through sequentially querying a (stochastic) gradient oracle. In the meantime, there exists an adversary who aims to free-ride and infer the learning outcome of the learner from observing the learner's queries. The adversary observes only the points of the queries but not the feedback from the oracle. The goal of the learner is to optimize the accuracy, i.e., obtaining an accurate estimate of the optimal point, while securing her privacy, i.e., making it difficult for the adversary to infer the optimal point. We formally quantify this tradeoff between the learner's accuracy and privacy and characterize the lower and upper bounds on the learner's query complexity as a function of the desired levels of accuracy and privacy. For the analysis of lower bounds, we provide a general template based on information-theoretic analysis and then tailor the template to several families of problems, including stochastic convex optimization and (noisy) binary search. We also present a generic secure learning protocol that achieves the matching upper bound up to logarithmic factors.
Researcher Affiliation | Academia | Wei Tang, Chien-Ju Ho, and Yang Liu. Washington University in St. Louis, UC Santa Cruz. {w.tang, chienju.ho}@wustl.edu, yangliu@ucsc.edu
Pseudocode | Yes | The details of our secure learning protocol, together with an example of a computation oracle, are included in Appendix B.4. (Referring to Appendix B.4, which contains 'Algorithm 1: Secure Learning Protocol' and 'Algorithm 2: Computation Oracle'.)
Open Source Code | No | The paper presents theoretical bounds and algorithms; it does not mention releasing any source code or provide links to repositories.
Open Datasets | No | The paper is theoretical and does not mention specific datasets or their public availability.
Dataset Splits | No | The paper is theoretical and does not describe any dataset splits for validation.
Hardware Specification | No | The paper is theoretical and does not describe any experiments that would require specific hardware.
Software Dependencies | No | The paper is theoretical and does not list any software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with specific hyperparameters or training configurations.
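The setting described in the abstract (a learner querying a stochastic gradient oracle while an adversary watches only the query points) can be sketched with a small toy example. This is purely illustrative and is not the paper's protocol from Appendix B.4: the quadratic objective, the random "decoy" queries used to obscure the query trace, and all parameter values below are assumptions made for the sketch.

```python
import random

def noisy_grad(x, sigma=0.1):
    # Stochastic gradient oracle for the illustrative objective
    # f(x) = (x - 2)^2; returns the true gradient plus zero-mean noise.
    return 2.0 * (x - 2.0) + random.gauss(0.0, sigma)

def secure_sgd(steps=2000, decoy_prob=0.5, lr=0.05):
    # Hypothetical sketch: the learner runs SGD but interleaves random
    # decoy queries, so an adversary who sees only the sequence of query
    # points (never the oracle feedback) has a harder time localizing
    # the optimum from the query trace.
    x = 0.0
    queries = []
    for _ in range(steps):
        if random.random() < decoy_prob:
            q = random.uniform(-5.0, 5.0)  # decoy query; feedback discarded
            noisy_grad(q)
            queries.append(q)
        else:
            g = noisy_grad(x)              # informative query
            queries.append(x)
            x -= lr * g
    return x, queries

random.seed(0)
x_hat, trace = secure_sgd()
```

In this toy, increasing `decoy_prob` spreads the query trace over the domain (more privacy) at the cost of more total queries for the same accuracy, which is the accuracy/privacy/query-complexity tradeoff the paper quantifies formally.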