Learning to Optimize Computational Resources: Frugal Training with Generalization Guarantees
Authors: Maria-Florina Balcan, Tuomas Sandholm, Ellen Vitercik (pp. 3227-3234)
AAAI 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We present an algorithm that identifies a finite set of promising parameters within an infinite set, given sample access to a distribution over problem instances. We prove that this set contains a nearly optimal parameter with high probability. The set can serve as the input to a configuration algorithm for finite parameter spaces... We prove that our sample complexity can be exponentially better than the best-known uniform convergence bound. |
| Researcher Affiliation | Collaboration | Maria-Florina Balcan (1), Tuomas Sandholm (1,2,3,4), Ellen Vitercik (1); (1) Computer Science Department, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA; (2) Optimized Markets, Inc., Pittsburgh, PA 15213, USA; (3) Strategic Machine, Inc., Pittsburgh, PA 15213, USA; (4) Strategy Robot, Inc., Pittsburgh, PA 15213, USA |
| Pseudocode | Yes | Algorithm 1: Algorithm for learning (ϵ, δ)-optimal subsets (an illustrative sketch, with stated assumptions, appears after this table) |
| Open Source Code | No | The paper does not provide any explicit statement or link for open-source code. |
| Open Datasets | No | The paper discusses problem instances and distributions over them (e.g., integer programs, clustering instances) but does not specify any publicly available datasets by name with access information (link, citation, or repository). |
| Dataset Splits | No | The paper is theoretical and focuses on algorithm design and proofs of guarantees (e.g., sample complexity, termination), not empirical validation with dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not describe any specific software dependencies with version numbers for experiments. |
| Experiment Setup | No | The paper is theoretical and focuses on algorithm design and proofs, not on empirical experimental setup details like hyperparameters or training configurations. |
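
The paper's Algorithm 1 is not reproduced on this page, but the abstract excerpt quoted above conveys its high-level shape: sample problem instances from the distribution, then distill the infinite parameter space down to a finite candidate set that contains a nearly optimal parameter with high probability. Below is a minimal, hypothetical Python sketch of that shape, not the authors' actual algorithm. It assumes a single real-valued parameter on an interval, and it invents two helpers for illustration: `sample_instance` (sample access to the instance distribution) and `critical_points` (enumerating where a sampled instance's performance changes, under an assumed piecewise-constant structure in the parameter).

```python
from typing import Callable, List, Sequence


def promising_parameters(
    sample_instance: Callable[[], object],
    critical_points: Callable[[object, float, float], Sequence[float]],
    lo: float,
    hi: float,
    num_samples: int,
) -> List[float]:
    """Distill the interval [lo, hi] into a finite candidate set.

    Hypothetical sketch only. Assumes each instance's performance is
    piecewise constant in the parameter, so critical_points(instance,
    lo, hi) can list the finitely many points where it changes.
    """
    boundaries = {lo, hi}
    for _ in range(num_samples):
        instance = sample_instance()
        boundaries.update(
            p for p in critical_points(instance, lo, hi) if lo <= p <= hi
        )
    pts = sorted(boundaries)
    # One representative (the midpoint) per piece induced by all samples:
    # within each piece, every sampled instance's performance is constant,
    # so a single representative per piece suffices as a candidate.
    return [(a + b) / 2 for a, b in zip(pts, pts[1:])]
```

Under these assumptions, the returned finite set could then be handed to any configuration procedure for finite parameter spaces, matching the role the paper assigns to its algorithm's output.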