Sample-Efficient Private Learning of Mixtures of Gaussians

Authors: Hassan Ashtiani, Mahbod Majid, Shyam Narayanan

Venue: NeurIPS 2024

Research Type: Theoretical
LLM Response: Our work purely focuses on sample complexity, and as noted in Theorems 1.4 and 1.5, our algorithms do not run in polynomial time. We note that the previous works of [AAL21, AAL24b] also do not run in polynomial time. Indeed, there is reason to believe that, even non-privately, it is impossible to learn GMMs in polynomial time in terms of the optimal sample complexity [DKS17, BRST21, GVV22]. Our results are theoretical guarantees on the sample complexity of privately learning mixtures of Gaussians; we do not provide any efficient or practical algorithms, and focus on statistical guarantees.

Researcher Affiliation: Collaboration
LLM Response:
  Hassan Ashtiani, McMaster University, zokaeiam@mcmaster.ca
  Mahbod Majid, MIT, mahbod@mit.edu
  Shyam Narayanan, Citadel Securities, shyam.s.narayanan@gmail.com

Pseudocode: Yes
LLM Response:
  Algorithm 1: FINE-ESTIMATE(X_1, X_2, ..., X_n ∈ ℝ^d; d, k, k′, ε, α, G, {(µ̂_j, Σ̂_j)}_{j ∈ [k′]})
  Algorithm 2: ESTIMATE(X_1, X_2, ..., X_{n+n′} ∈ ℝ^d; k, ε, δ, α)
  Algorithm 3: ESTIMATE1D(X_1, X_2, ..., X_{n+n′} ∈ ℝ; k, ε, δ, α)
(For a concrete, non-private point of reference on the GMM estimation task these interfaces address, see the sketch after this table.)

Open Source Code: No
LLM Response: Our results are theoretical guarantees on the sample complexity of privately learning mixtures of Gaussians. We do not provide any efficient or practical algorithms, and focus on statistical guarantees. Our paper does not release any new assets.

Open Datasets: No
LLM Response: No experiments.

Dataset Splits: No
LLM Response: No experiments.

Hardware Specification: No
LLM Response: No experiments.

Software Dependencies: No
LLM Response: No experiments.

Experiment Setup: No
LLM Response: No experiments.
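
Since the paper ships pseudocode only, the following is a minimal, non-private point of reference, not the paper's method: it fits a k-component GMM on toy data with scikit-learn's EM implementation. The paper's ESTIMATE interface additionally takes privacy parameters (ε, δ) and an accuracy target α, which have no counterpart in this sketch; the data and constants below are purely illustrative.

# Minimal NON-PRIVATE baseline for k-component GMM estimation via EM
# (scikit-learn). This is NOT the paper's algorithm: the paper's ESTIMATE
# procedure is (ε, δ)-differentially private and information-theoretic,
# with no known polynomial-time implementation.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n, d, k = 2000, 2, 2  # toy sizes, chosen for illustration only

# Draw n samples from a synthetic 2-component mixture in d = 2 dimensions.
X = np.vstack([
    rng.multivariate_normal(mean=[0.0, 0.0], cov=np.eye(d), size=n // 2),
    rng.multivariate_normal(mean=[4.0, 4.0], cov=0.5 * np.eye(d), size=n // 2),
])

# Fit mixture weights, means, and full covariances by (non-private) EM.
gmm = GaussianMixture(n_components=k, covariance_type="full", random_state=0).fit(X)
print("weights:", gmm.weights_)
print("means:\n", gmm.means_)

A differentially private analogue achieving the paper's sample complexity is information-theoretic and, per the authors, has no known polynomial-time implementation.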