Context Attentive Bandits: Contextual Bandit with Restricted Context
Authors: Djallel Bouneffouf, Irina Rish, Guillermo Cecchi, Raphaël Féraud
IJCAI 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our empirical results demonstrate advantages of the proposed approaches on several real-life datasets. |
| Researcher Affiliation | Industry | IBM Thomas J. Watson Research Center, Yorktown Heights, NY, USA; Orange Labs, 2 av. Pierre Marzin, 22300 Lannion, France |
| Pseudocode | Yes | Algorithm 1 The CBRC Problem Setting, Algorithm 2 Thompson Sampling with Restricted Context (TSRC) |
| Open Source Code | No | The paper does not provide explicit statements or links to open-source code for the described methodology. |
| Open Datasets | Yes | Empirical evaluation of the proposed methods was based on four datasets from the UCI Machine Learning Repository (https://archive.ics.uci.edu/ml/datasets.html): Covertype, CNAE-9, Internet Advertisements and Poker Hand (for details of each dataset, see Table 1). |
| Dataset Splits | No | The paper describes a sequential data stream simulation and online learning setup rather than explicit, fixed train/validation/test dataset splits with percentages or counts. |
| Hardware Specification | No | The paper does not provide any specific hardware details (e.g., CPU, GPU models, or memory specifications) used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | Yes | We ran the above algorithms and our proposed TSRC and WTSRC methods, in stationary and non-stationary settings, respectively, for different feature subset sizes: 5%, 25%, 50% and 75% of the total number of features. [...] In this setting, for each dataset, we run the experiments for 3,000,000 iterations, changing the class label every 500,000 iterations to simulate non-stationarity. |
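To make the TSRC setting referenced above concrete, here is a minimal sketch of Thompson Sampling with a restricted context: per-feature Beta posteriors select which d of the D features to observe each round, and a per-arm Bayesian linear model scores arms on the restricted context. This is an illustrative assumption of the general scheme, not the authors' exact Algorithm 2; all class and variable names are hypothetical.

```python
import numpy as np

class TSRC:
    """Hypothetical sketch: contextual bandit that may only observe
    `subset_size` of the `n_features` context features per round."""

    def __init__(self, n_arms, n_features, subset_size, seed=0):
        self.rng = np.random.default_rng(seed)
        self.d = subset_size
        # Beta posteriors over each feature's usefulness (modeling assumption)
        self.a = np.ones(n_features)
        self.b = np.ones(n_features)
        # Per-arm Bayesian linear-regression statistics on the restricted context
        self.B = [np.eye(n_features) for _ in range(n_arms)]
        self.f = [np.zeros(n_features) for _ in range(n_arms)]

    def select_features(self):
        # Thompson step over features: sample usefulness, keep the top d
        theta = self.rng.beta(self.a, self.b)
        return np.argsort(theta)[-self.d:]

    def select_arm(self, context, subset):
        # Zero out unobserved features, then sample a parameter vector per arm
        x = np.zeros_like(context)
        x[subset] = context[subset]
        scores = []
        for B, f in zip(self.B, self.f):
            mu = np.linalg.solve(B, f)
            sample = self.rng.multivariate_normal(mu, np.linalg.inv(B))
            scores.append(x @ sample)
        return int(np.argmax(scores)), x

    def update(self, arm, x, subset, reward):
        # Update the chosen arm's linear model and the feature posteriors
        self.B[arm] += np.outer(x, x)
        self.f[arm] += reward * x
        self.a[subset] += reward        # Bernoulli-style feature feedback
        self.b[subset] += 1 - reward
```

In a non-stationary variant (in the spirit of WTSRC), one would additionally discount or window the per-feature Beta counts so that feature relevance can drift over time.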