Agnostically Learning Single-Index Models using Omnipredictors
Authors: Aravind Gollakota, Parikshit Gopalan, Adam Klivans, Konstantinos Stavropoulos
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our main result is the first efficient learning algorithm with a guarantee of this form. Our analysis is simple and relies on the relationship between Bregman divergences (or matching losses) and ℓp distances. We also provide new guarantees for standard algorithms like GLMtron and logistic regression in the agnostic setting. |
| Researcher Affiliation | Collaboration | Aravind Gollakota (Apple), Parikshit Gopalan (Apple), Adam R. Klivans (UT Austin), Konstantinos Stavropoulos (UT Austin) |
| Pseudocode | Yes | Algorithm 1: Calibrated Multiaccuracy (modification of Algorithm 2 in Gopalan et al. [2023]) |
| Open Source Code | No | The paper does not provide any statement or link regarding the availability of open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and does not use or describe a specific public dataset for empirical evaluation. It refers to abstract 'distributions' rather than concrete datasets. |
| Dataset Splits | No | The paper is theoretical and does not involve empirical experiments with datasets, thus no dataset split information (train/validation/test) is provided. |
| Hardware Specification | No | The paper focuses on theoretical aspects and does not describe any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not mention any software dependencies or version numbers that would be required for reproducibility. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details such as hyperparameters or training configurations for empirical evaluation. |
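The Pseudocode row above notes that the paper analyzes standard algorithms such as GLMtron alongside its Algorithm 1 (Calibrated Multiaccuracy). For orientation, here is a minimal, hedged sketch of the classic GLMtron update of Kakade et al. (2011); this is an illustrative reference implementation of that known baseline, not the paper's Algorithm 1, and all names here (`glmtron`, the toy data) are our own.

```python
import numpy as np

def glmtron(X, y, u, iters=500):
    """Sketch of the GLMtron iteration (Kakade et al., 2011) for
    learning a single-index model y ~ u(w . x) with a known monotone
    link u. Not the paper's Calibrated Multiaccuracy algorithm."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        preds = u(X @ w)                 # current predictions u(w . x_i)
        w = w + (X.T @ (y - preds)) / n  # empirical-residual update step
    return w

# Toy usage: noiseless data from a sigmoid single-index model.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
w_star = np.array([1.0, -0.5, 0.25])     # hypothetical ground truth
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
y = sigmoid(X @ w_star)
w_hat = glmtron(X, y, sigmoid)
mse = np.mean((sigmoid(X @ w_hat) - y) ** 2)
```

In this noiseless toy setting the iterates drive the squared prediction error toward zero; the paper's contribution is precisely about what such algorithms guarantee in the harder agnostic setting, where no such clean link assumption holds.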