Online Pandora’s Boxes and Bandits
Authors: Hossein Esfandiari, MohammadTaghi HajiAghayi, Brendan Lucier, Michael Mitzenmacher (pp. 1885-1892)
AAAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our results use a reduction-based framework that separates the cost of acquiring information from the online decision process of which prizes to keep. We show that in many scenarios Pandora can achieve a good approximation to the best possible performance. Our main result is a reduction from this general class of problems, which we refer to as online Pandora's box problems, to the problem of finding threshold-based algorithms for the associated prophet inequality problems where all costs are zero. |
| Researcher Affiliation | Collaboration | 1Google Research, 2University of Maryland, 3Microsoft Research, 4Harvard University esfandiari@google.com, hajiagha@cs.umd.edu, brlucier@microsoft.com, michaelm@eecs.harvard.edu |
| Pseudocode | No | The paper describes algorithms and theoretical constructs but does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement or link indicating the availability of open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and does not involve the use of datasets for training, hence no information about public availability is provided. |
| Dataset Splits | No | The paper is theoretical and does not involve experiments requiring training, validation, or test dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not discuss the hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not mention specific software dependencies with version numbers for implementation or experiments. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup, hyperparameters, or system-level training settings. |
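The Research Type row mentions a reduction from online Pandora's box problems to threshold-based algorithms for prophet inequality problems. The following minimal sketch (not the paper's algorithm; all box costs, distributions, and the simulated threshold are illustrative assumptions) shows the general flavor of such a threshold rule: pay to open boxes in arrival order and keep the first prize that clears a prophet-style threshold.

```python
import random

# Hedged sketch (not the paper's construction): a single-threshold rule in
# the spirit of prophet inequalities, applied to a Pandora's-box-style
# stream where opening box i costs c[i] and reveals a prize drawn uniformly
# from [0, v_max[i]] (an illustrative distributional choice).

def run_threshold_policy(boxes, threshold, rng):
    """Open boxes in arrival order; keep the first prize >= threshold.

    `boxes` is a list of (cost, v_max) pairs. Returns net utility:
    the kept prize minus all inspection costs paid.
    """
    total_cost = 0.0
    for cost, v_max in boxes:
        total_cost += cost          # pay to open (inspect) this box
        prize = rng.uniform(0.0, v_max)
        if prize >= threshold:      # prophet-style threshold acceptance
            return prize - total_cost
    return -total_cost              # kept nothing; costs were still paid

rng = random.Random(0)
boxes = [(0.05, 1.0), (0.10, 2.0), (0.02, 0.5)]

# A common prophet-inequality choice of threshold: half the expected
# maximum prize, estimated here by simulation purely for illustration.
est_max = sum(max(rng.uniform(0.0, v) for _, v in boxes)
              for _ in range(2000)) / 2000
threshold = est_max / 2

avg = sum(run_threshold_policy(boxes, threshold, rng)
          for _ in range(2000)) / 2000
print(round(avg, 3))
```

In the paper's framework the threshold algorithm is analyzed for the zero-cost prophet problem and then lifted to the costly-inspection setting; this sketch only illustrates the mechanics of a threshold stopping rule, not that analysis.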