Relative Deviation Margin Bounds
Authors: Corinna Cortes, Mehryar Mohri, Ananda Theertha Suresh
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds, in terms of either the Rademacher complexity or the empirical ℓ∞ covering number of the hypothesis set used, both distribution-dependent and valid for general families. Furthermore, using our relative deviation margin bounds, we derive distribution-dependent generalization bounds for unbounded loss functions under the assumption of a finite moment. We also briefly highlight several applications of these bounds and discuss their connection with existing results. |
| Researcher Affiliation | Collaboration | Corinna Cortes (1), Mehryar Mohri (1, 2), Ananda Theertha Suresh (1); 1: Google Research, New York, NY; 2: Courant Institute of Mathematical Sciences, New York, NY. |
| Pseudocode | No | The paper presents mathematical proofs and theoretical derivations. It does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper describes theoretical learning bounds and does not mention releasing any open-source code for an implemented methodology. |
| Open Datasets | No | The paper is theoretical and focuses on deriving learning bounds. It does not describe experiments that involve training on a public dataset. |
| Dataset Splits | No | The paper is theoretical and does not involve practical experiments with datasets, and thus, no training/validation/test splits are mentioned. |
| Hardware Specification | No | This is a theoretical paper that does not involve computational experiments, so no hardware specifications are provided. |
| Software Dependencies | No | This is a theoretical paper that reports no experiments; therefore, no software dependencies or version numbers are listed. |
| Experiment Setup | No | This is a theoretical paper presenting new bounds and proofs. It does not describe any experimental setup details such as hyperparameters or training configurations. |
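
For context on the Research Type row, the LaTeX sketch below writes out the shape of the bounds the abstract refers to. The first display is the classical Rademacher-complexity margin bound (in the style of Koltchinskii and Panchenko); the second is only a schematic of a relative deviation margin bound, with constants and logarithmic factors suppressed, not the paper's exact statement, where epsilon stands for the combined complexity and confidence term.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Classical margin bound: with probability at least 1 - \delta over an
% i.i.d. sample S of size m, for every hypothesis h in H and margin \rho > 0,
% the generalization error R(h) is bounded by the empirical \rho-margin loss
% \widehat{R}_{S,\rho}(h) plus a complexity term and a confidence term.
\begin{equation*}
  R(h) \;\le\; \widehat{R}_{S,\rho}(h)
  \;+\; \frac{2}{\rho}\,\mathfrak{R}_m(H)
  \;+\; \sqrt{\frac{\log(1/\delta)}{2m}} .
\end{equation*}

% Schematic of a relative deviation margin bound (constants and log factors
% suppressed; NOT the paper's exact statement). Here \epsilon collects the
% complexity term (\mathfrak{R}_m(H)/\rho or a covering-number term) and the
% confidence term. Because the deviation scales with
% \sqrt{\widehat{R}_{S,\rho}(h)}, the bound interpolates between an
% O(\epsilon^2) rate when the empirical margin loss is zero and the usual
% O(\epsilon) rate otherwise.
\begin{equation*}
  R(h) \;\lesssim\; \widehat{R}_{S,\rho}(h)
  \;+\; \sqrt{\widehat{R}_{S,\rho}(h)}\;\epsilon
  \;+\; \epsilon^2 .
\end{equation*}

\end{document}
```

The schematic makes the abstract's "more favorable" claim concrete: when the empirical margin loss is near zero, the deviation term is of order epsilon squared rather than epsilon, which is the usual advantage of relative deviation bounds over the classical form.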