Learning Mixtures of Gaussians with Censored Data
Authors: Wai Ming Tai, Bryon Aragam
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We study the problem of learning mixtures of Gaussians with censored data. Statistical learning with censored data is a classical problem with numerous practical applications; however, finite-sample guarantees for even simple latent variable models such as Gaussian mixtures are missing. Formally, we are given censored data from a mixture of univariate Gaussians $\sum_{i=1}^{k} w_i \mathcal{N}(\mu_i, \sigma^2)$, i.e. the sample is observed only if it lies inside a set $S$. The goal is to learn the weights $w_i$ and the means $\mu_i$. We propose an algorithm that takes only $\frac{1}{\varepsilon^{O(k)}}$ samples to estimate the weights $w_i$ and the means $\mu_i$ within $\varepsilon$ error. (The censored observation model is illustrated in the sketch after this table.) |
| Researcher Affiliation | Academia | Booth School of Business, University of Chicago, Chicago, USA. Correspondence to: Wai Ming Tai <waiming.tai@chicagobooth.edu>, Bryon Aragam <bryon@chicagobooth.edu>. |
| Pseudocode | Yes | Algorithm 1 Learning mixtures of Gaussians with censored data |
| Open Source Code | No | The paper does not provide any explicit statement about releasing open-source code for their methodology or a link to a repository. |
| Open Datasets | No | The paper is theoretical and does not use or reference any datasets for training. Thus, no information about public availability of datasets is provided. |
| Dataset Splits | No | The paper is theoretical and does not involve data splitting for training, validation, or testing. Therefore, no information on dataset splits is provided. |
| Hardware Specification | No | The paper is theoretical and does not describe any experimental setup or hardware used. |
| Software Dependencies | No | The paper is theoretical and does not specify any software dependencies or versions. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup or hyperparameter details. |
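
For readers unfamiliar with the observation model quoted in the abstract above, the following minimal Python sketch simulates it: draws from the mixture $\sum_{i=1}^{k} w_i \mathcal{N}(\mu_i, \sigma^2)$ are recorded only when they fall inside the set $S$. The specific weights, means, the window $S$, and the helper `sample_censored` are illustrative assumptions, not values or code from the paper, and the sketch shows only the data-generating process, not the paper's Algorithm 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for a k = 2 univariate mixture (not taken from the paper).
weights = np.array([0.3, 0.7])   # mixing weights w_i
means = np.array([-1.0, 2.0])    # component means mu_i
sigma = 1.0                      # shared standard deviation, as in the paper's model
S = (0.0, np.inf)                # observation set S: a draw is seen only if it lies in S


def sample_censored(n_observed):
    """Draw from sum_i w_i N(mu_i, sigma^2) and keep only the draws that land
    inside S, until n_observed such samples have been collected."""
    kept = []
    while len(kept) < n_observed:
        comp = rng.choice(len(weights), size=n_observed, p=weights)
        x = rng.normal(means[comp], sigma)
        kept.extend(x[(x >= S[0]) & (x <= S[1])])
    return np.array(kept[:n_observed])


xs = sample_censored(10_000)
# Statistics of the censored sample; these differ from the uncensored
# mixture's moments because draws outside S are never observed.
print(xs.mean(), xs.std())
```

Because draws outside $S$ are never seen, the empirical moments of the observed sample are biased relative to the underlying mixture, which is what makes recovering the $w_i$ and $\mu_i$ from censored data nontrivial.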