Derandomizing Multi-Distribution Learning
Authors: Kasper Green Larsen, Omar Montasser, Nikita Zhivotovskiy
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The paper is theoretical and contains no experiments, data, or code. |
| Researcher Affiliation | Academia | Kasper Green Larsen, Department of Computer Science, Aarhus University (larsen@cs.au.dk); Omar Montasser, Department of Statistics and Data Science, Yale University (omar.montasser@yale.edu); Nikita Zhivotovskiy, Department of Statistics, University of California, Berkeley (zhivotovskiy@berkeley.edu) |
| Pseudocode | Yes | Algorithm 1: DETERMINISTICLEARNER(P, ε, δ, A) |
| Open Source Code | No | The paper is theoretical and contains no experiments, data, or code. |
| Open Datasets | No | The paper is theoretical and contains no experiments, data, or code. |
| Dataset Splits | No | The paper is theoretical and contains no experiments, data, or code. |
| Hardware Specification | No | The paper is theoretical and contains no experiments, data, or code. |
| Software Dependencies | No | The paper is theoretical and contains no experiments, data, or code. |
| Experiment Setup | No | The paper is theoretical and contains no experiments, data, or code. |
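The table notes that the paper provides pseudocode for Algorithm 1, DETERMINISTICLEARNER(P, ε, δ, A). The actual procedure is defined in the paper; purely as an illustration of the general derandomization idea, the sketch below wraps a possibly randomized base learner `A` by running it with fixed seeds and taking a pointwise majority vote of the returned hypotheses. All names (`derandomized_learner`, `majority_vote`, the toy learner) are hypothetical and are not the paper's Algorithm 1.

```python
import random

def majority_vote(hypotheses):
    """Combine binary classifiers by pointwise majority vote."""
    def h(x):
        votes = sum(hyp(x) for hyp in hypotheses)
        return 1 if 2 * votes > len(hypotheses) else 0
    return h

def derandomized_learner(sample, A, repetitions=5, seed=0):
    """Illustrative derandomization wrapper (NOT the paper's Algorithm 1):
    run the (possibly randomized) base learner A several times with
    deterministically derived seeds, then majority-vote the hypotheses.
    The fixed master seed makes the overall output deterministic."""
    rng = random.Random(seed)
    hyps = [A(sample, random.Random(rng.randrange(2**32)))
            for _ in range(repetitions)]
    return majority_vote(hyps)

# Toy randomized base learner: predicts the sample's majority label,
# breaking ties with the supplied rng.
def toy_learner(sample, rng):
    ones = sum(y for _, y in sample)
    if 2 * ones == len(sample):
        label = rng.randrange(2)
    else:
        label = 1 if 2 * ones > len(sample) else 0
    return lambda x: label

sample = [(0.1, 0), (0.2, 0), (0.9, 1)]
h = derandomized_learner(sample, toy_learner)
print(h(0.5))  # majority label of the sample is 0
```

Because the master seed fixes every internal random choice, repeated calls with the same inputs return the same predictor, which is the defining property a deterministic learner must have.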