Batched Thompson Sampling
Authors: Cem Kalkanli, Ayfer Ozgur
NeurIPS 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Sections 5 (Experimental Setup) and 6 (Experiments and Results): We evaluate BT-OFUL and BT-UCB against the batch Thompson Sampling (BTS) algorithm by considering both synthetic datasets and real-world datasets. |
| Researcher Affiliation | Academia | Cem Kalkanli, Ayfer Ozgur (Stanford University) |
| Pseudocode | Yes | Figure 1: Algorithm 1: BT-TS with fixed batch size, Figure 2: Algorithm 2: BT-OFUL, Figure 3: Algorithm 3: BT-UCB |
| Open Source Code | No | The paper does not provide an explicit statement about, or link to, open-source code implementing its methodology. |
| Open Datasets | Yes | We evaluate BT-OFUL and BT-UCB against the batch Thompson Sampling (BTS) algorithm by considering both synthetic datasets and real-world datasets. For real-world datasets, we use the Yahoo! Front Page Today Module (FP.Today) dataset [24] and MovieLens 1M dataset. |
| Dataset Splits | No | For the Yahoo! Today Front Page dataset, we use the first 200,000 samples for training and the next 100,000 samples for testing. No explicit mention of a 'validation' split. |
| Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory amounts, or detailed computer specifications) are mentioned for running the experiments. |
| Software Dependencies | No | No specific software dependencies with version numbers are mentioned in the paper. |
| Experiment Setup | Yes | For the synthetic datasets, we use a fixed learning rate of 0.01 for all algorithms. For the Yahoo! Today Front Page dataset, we set the batch size K = 1000, λ = 1, and η = 0.01. Similar parameters are used for the MovieLens 1M dataset: batch size K = 1000, λ = 1, η = 0.01. |
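
The Pseudocode row above points to Algorithm 1 (BT-TS with fixed batch size). For reference, below is a minimal sketch of Thompson Sampling with posterior updates deferred to fixed-size batch boundaries, assuming a Bernoulli bandit with Beta(1, 1) priors; the function name, reward simulation, and variable names are illustrative and are not taken from the paper.

```python
import numpy as np

def batched_thompson_sampling(arm_means, horizon, batch_size, rng=None):
    """Thompson Sampling for a Bernoulli bandit where the posterior is
    updated only at fixed batch boundaries (every `batch_size` rounds).

    arm_means: true success probabilities, used only to simulate rewards.
    """
    rng = rng or np.random.default_rng()
    n_arms = len(arm_means)
    # Beta(1, 1) priors: alpha counts successes, beta counts failures.
    alpha = np.ones(n_arms)
    beta = np.ones(n_arms)
    # Rewards collected in the current batch, applied only when it closes.
    pending = np.zeros((n_arms, 2))
    total_reward = 0.0

    for t in range(horizon):
        # Sample from the posterior frozen at the last batch boundary.
        theta = rng.beta(alpha, beta)
        arm = int(np.argmax(theta))
        reward = float(rng.random() < arm_means[arm])
        pending[arm, 0] += reward
        pending[arm, 1] += 1.0 - reward
        total_reward += reward

        # Posterior update happens only at the end of each batch.
        if (t + 1) % batch_size == 0:
            alpha += pending[:, 0]
            beta += pending[:, 1]
            pending[:] = 0.0

    return total_reward
```

With batch_size set to 1 this reduces to standard Thompson Sampling; larger batches defer feedback to fewer update points, which is the batched-feedback regime the paper studies.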
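
The Dataset Splits and Experiment Setup rows quote concrete values, collected below into a small configuration sketch; the dictionary keys and the helper function are illustrative placeholders rather than identifiers from the paper.

```python
# Hyperparameters quoted in the Experiment Setup row.
EXPERIMENT_CONFIG = {
    "synthetic":      {"learning_rate": 0.01},
    "yahoo_fp_today": {"batch_size": 1000, "lambda": 1.0, "eta": 0.01},
    "movielens_1m":   {"batch_size": 1000, "lambda": 1.0, "eta": 0.01},
}

def split_yahoo_samples(samples):
    """Train/test split described in the Dataset Splits row:
    first 200,000 samples for training, next 100,000 for testing."""
    train = samples[:200_000]
    test = samples[200_000:300_000]
    return train, test
```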