Cascading Contextual Assortment Bandits
Authors: Hyun-jun Choi, Rajan Udwani, Min-hwan Oh
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We substantiate our theoretical claims with numerical experiments, demonstrating the practical efficacy of our proposed methods. |
| Researcher Affiliation | Academia | Hyun-jun Choi Seoul National University Rajan Udwani UC Berkeley Min-hwan Oh Seoul National University |
| Pseudocode | Yes | Algorithm 1 UCB-CCA |
| Open Source Code | No | The paper does not provide access to source code for the methodology it describes. |
| Open Datasets | No | The paper describes generating synthetic data for simulations rather than using a publicly available dataset. It states: 'For simulations, we generate a random sample of the unknown time-invariant parameter from N(0, 1) at the beginning of the simulation. We sample N feature vectors from N(0, 1) in each round t.' |
| Dataset Splits | No | The paper focuses on online learning and regret in a simulation setting with synthetically generated data, and thus does not explicitly describe train/validation/test dataset splits or cross-validation methodology. |
| Hardware Specification | No | The paper does not provide specific hardware details such as exact GPU/CPU models, processor types, or memory amounts used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers (e.g., library or solver names with version numbers) needed to replicate the experiment. |
| Experiment Setup | No | The paper describes the general simulation setup, including data generation parameters (e.g., sampling from N(0,1)) and how the oracle computes cascades. However, it does not specify concrete hyperparameters or system-level configurations, such as the value of the ridge penalty parameter λ used in experiments, learning rates, or other optimizer settings. |
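The quoted simulation setup (a time-invariant parameter drawn once from N(0, 1), with N feature vectors drawn from N(0, 1) per round) can be sketched as below. This is a minimal illustration, not the authors' code: the feature dimension `d`, the number of items `N`, and the number of rounds `T` are hypothetical placeholders, since the paper's quoted description does not fix them here.

```python
import numpy as np

def sample_round_features(rng, N, d):
    """Draw N feature vectors for one round, each entry i.i.d. from N(0, 1)."""
    return rng.standard_normal((N, d))

rng = np.random.default_rng(0)
d, N, T = 5, 10, 3  # illustrative: feature dim, items per round, number of rounds

# Unknown time-invariant parameter, sampled once at the start of the simulation.
theta = rng.standard_normal(d)

for t in range(T):
    X = sample_round_features(rng, N, d)  # fresh feature vectors in round t
    utilities = X @ theta                 # linear utility of each item under theta
```

A learner such as UCB-CCA would observe `X`, choose an assortment, and receive cascade feedback; the snippet covers only the data-generation step described in the table.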