Generalization in Portfolio-Based Algorithm Selection
Authors: Maria-Florina Balcan, Tuomas Sandholm, Ellen Vitercik (pp. 12225-12232)
AAAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We provide experiments that illustrate the tradeoff we investigated from a theoretical perspective in the previous sections: as we increase the portfolio size, we can hope to include a well-suited parameter setting for any problem instance, but it becomes increasingly difficult to avoid overfitting. We illustrate this in the context of integer programming algorithm configuration. |
| Researcher Affiliation | Collaboration | (1) Computer Science Department, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213; (2) Optimized Markets, Inc., Pittsburgh, PA 15213, USA; (3) Strategic Machine, Inc., Pittsburgh, PA 15213, USA; (4) Strategy Robot, Inc., Pittsburgh, PA 15213, USA |
| Pseudocode | No | The paper describes procedures and algorithms in prose but does not include any clearly labeled pseudocode blocks or algorithm listings. |
| Open Source Code | No | The paper mentions using "Python's default scikit-learn regression forest" but does not state that the authors' own code or methodology is open-source or provide a link to it. |
| Open Datasets | Yes | We analyze a distribution over IPs formulating the combinatorial auction winner determination problem, which we generate using the Combinatorial Auction Test Suite (Leyton-Brown, Pearson, and Shoham 2000). |
| Dataset Splits | Yes | We draw a training set of M = 1000 IPs z_1, ..., z_M ~ D and solve for the dual functions u_{z_1}, ..., u_{z_M}, which measure tree size as a function of the parameter ρ, using the algorithm described in Appendix D.1 of the paper by Balcan et al. (2018). ... We draw a training set z_1, ..., z_N ~ D of IPs (with N specified below). ... We draw N_t = 10^4 test instances S_t ~ D^{N_t}. |
| Hardware Specification | Yes | All experiments were run on a 64-core machine with 512 GB of RAM, an m4.16xlarge Amazon AWS instance, and a cluster of m4.xlarge Amazon AWS instances. |
| Software Dependencies | Yes | We override the default variable selection of CPLEX 12.8.0.0 using the C API. ... We use Python's default scikit-learn regression forest (Pedregosa et al. 2011). |
| Experiment Setup | Yes | We configure CPLEX, one of the most widely used commercial solvers. ... We tune a parameter ρ ∈ [0, 1] of CPLEX that controls its variable selection policy... We learn regression forests F_1, ..., F_10 for each of the 10 parameter settings in the portfolio P̂. ... We then train the forest F_i corresponding to the parameter setting ρ_i using the labeled training set {(z_1, u_{z_1}(ρ_i)), ..., (z_N, u_{z_N}(ρ_i))}. We use Python's default scikit-learn regression forest (Pedregosa et al. 2011). (A hedged code sketch of this per-parameter regression-forest pipeline appears below the table.) |
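
The Experiment Setup row describes a pipeline with one regression forest per portfolio entry, each trained to predict the tree size u_z(ρ_i), with the setting whose predicted tree size is smallest chosen for each new instance. The sketch below is a minimal illustration of that pipeline, assuming scikit-learn as cited in the paper; the instance features and the `sample_ips` and `tree_size` helpers are hypothetical placeholders, since the authors' own code is not public.

```python
# Hedged sketch of the per-parameter regression-forest selection pipeline.
# `sample_ips` and `tree_size` are illustrative stand-ins, not the authors' code.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical portfolio of 10 CPLEX variable-selection parameter settings in [0, 1].
portfolio = np.linspace(0.0, 1.0, 10)

def sample_ips(n, d=20):
    """Placeholder: stand-in feature vectors for sampled integer programs."""
    return rng.normal(size=(n, d))

def tree_size(ip_features, rho):
    """Placeholder for u_z(rho): in the paper this is the branch-and-bound
    tree size measured by actually running CPLEX with parameter rho."""
    return float(np.abs(ip_features @ np.linspace(-1.0, 1.0, ip_features.size))
                 + 100.0 * (rho - 0.5) ** 2)

# Training set z_1, ..., z_N drawn from the instance distribution D.
N = 1000
train_ips = sample_ips(N)

# One regression forest F_i per parameter setting rho_i, trained on the
# labeled pairs (z_j, u_{z_j}(rho_i)).
forests = []
for rho in portfolio:
    labels = np.array([tree_size(z, rho) for z in train_ips])
    forests.append(RandomForestRegressor(random_state=0).fit(train_ips, labels))

# At test time, predict each setting's tree size on a new instance and
# run the setting with the smallest prediction.
test_ip = sample_ips(1)
predictions = np.array([f.predict(test_ip)[0] for f in forests])
best_rho = portfolio[int(np.argmin(predictions))]
print(f"Selected parameter setting rho = {best_rho:.2f}")
```

In the paper, the labels u_{z_j}(ρ_i) come from solving each training IP with CPLEX under parameter ρ_i; here they are synthesized only so the sketch runs end to end.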