Uniform Deviation Bounds for k-Means Clustering
Authors: Olivier Bachem, Mario Lucic, S. Hamed Hassani, Andreas Krause
ICML 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we provide a novel framework to obtain uniform deviation bounds for unbounded loss functions. As a result, we obtain competitive uniform deviation bounds for k-Means clustering under weak assumptions on the underlying distribution. If the fourth moment is bounded, we prove a rate of O(m^{-1/2}) compared to the previously known O(m^{-1/4}) rate. We further show that this rate also depends on the kurtosis (the normalized fourth moment), which measures the tailedness of the distribution. We also provide improved rates under progressively stronger assumptions, namely, bounded higher moments, subgaussianity and bounded support of the underlying distribution. (An illustrative sketch of the O(m^{-1/2}) rate follows the table.) |
| Researcher Affiliation | Academia | Department of Computer Science, ETH Zurich. Correspondence to: Olivier Bachem <olivier.bachem@inf.ethz.ch>. |
| Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any information or links regarding the availability of open-source code for the methodology described. |
| Open Datasets | No | The paper focuses on theoretical bounds for general distributions P and finite samples X_1, ..., X_m drawn from them, not on specific publicly available datasets for training models. |
| Dataset Splits | No | The paper is theoretical and does not describe empirical experiments with training, validation, or test dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not mention any hardware specifications used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not list any specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any specific experimental setup details, such as hyperparameters or training configurations. |
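
Although the paper is purely theoretical, its headline O(m^{-1/2}) deviation rate is easy to probe numerically. The sketch below is not code from the paper: it assumes a standard Gaussian as the underlying distribution P (the paper only requires a bounded fourth moment), approximates the population risk with a large reference sample, and replaces the supremum over all center sets Q with a max over a small random family, so all names (`quantization_error`, `candidate_Qs`) and parameter choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantization_error(points, centers):
    """k-Means cost: mean squared distance from each point to its nearest center."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)  # (n, k)
    return d2.min(axis=1).mean()

k, dim = 3, 2
# Finite random family of center sets, standing in for the supremum
# over all Q in the paper's uniform bound.
candidate_Qs = [rng.normal(size=(k, dim)) for _ in range(50)]

# "Population" risk phi(Q | P), approximated with a large reference sample
# from P (here P = standard Gaussian; an assumption made for this sketch).
reference = rng.normal(size=(200_000, dim))
population_risk = [quantization_error(reference, Q) for Q in candidate_Qs]

for m in (100, 1_000, 10_000):
    sample = rng.normal(size=(m, dim))
    # Uniform deviation over the candidate family:
    # max_Q |phi(Q | X_m) - phi(Q | P)|
    deviation = max(
        abs(quantization_error(sample, Q) - population_risk[i])
        for i, Q in enumerate(candidate_Qs)
    )
    print(f"m={m:>6}  max |empirical - population| = {deviation:.4f}")
```

If the O(m^{-1/2}) rate holds, each tenfold increase in m should shrink the printed deviation by roughly a factor of √10 ≈ 3.2; under the weaker O(m^{-1/4}) rate the shrinkage per tenfold step would only be about 1.8.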