Low-Rank Bandit Methods for High-Dimensional Dynamic Pricing

Authors: Jonas W. Mueller, Vasilis Syrgkanis, Matt Taddy

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental (5 experiments) | We evaluate the performance of our methodology in settings where noisy demands are generated according to equation (2)... Our proposed algorithms are compared against the GDG online bandit algorithm of Flaxman et al. [2005]... Figures 1A and 1B show that our OPOK and OPOL algorithms are greatly superior to GDG... [A sketch of the GDG baseline is given after this table.]
Researcher Affiliation | Collaboration | Jonas Mueller, MIT CSAIL (jonasmueller@csail.mit.edu); Vasilis Syrgkanis, Microsoft Research (vasy@microsoft.com); Matt Taddy, Chicago Booth (taddy@chicagobooth.edu)
Pseudocode | Yes | Algorithm 1: OPOK (Online Pricing Optimization with Known Features); Algorithm 2: FINDPRICE(x; U, S, p_{t-1}); Algorithm 3: PROJECTION(x, , U, S); Algorithm 4: OPOL (Online Pricing Optimization with Latent Features)
Open Source Code | No | The paper does not contain any explicit statement about providing open-source code for the described methodology, nor does it provide a link to a code repository.
Open Datasets | Yes | Historical demand data obtained from: www.kaggle.com/c/grupo-bimbo-inventory-demand/
Dataset Splits | No | The paper describes performance evaluation of online algorithms and analysis of historical data, but does not provide details on specific training, validation, or test dataset splits.
Hardware Specification | No | The paper does not provide specific hardware details such as GPU or CPU models, or cloud instance types used for running the experiments.
Software Dependencies | No | The paper does not provide specific details on software dependencies with version numbers (e.g., programming languages, libraries, or solvers) used for implementing the described methods.
Experiment Setup | Yes | Throughout, p_t and q_t represent rescaled rather than absolute prices/demands, such that the feasible set S can simply be fixed as a centered sphere of radius r = 20. Noise in the (rescaled) demands for each individual product is always sampled as ε_t ~ N(0, 10). Before each experiment, we sample the entries of z, V independently as z_ij ~ N(100, 20), V_ij ~ N(0, 2), and U is fixed as a random sparse binary matrix... [A sketch of this synthetic setup is given after this table.]
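
To make the Experiment Setup row concrete, here is a minimal Python sketch of the synthetic environment. Only the sampling distributions (z_ij ~ N(100, 20), V_ij ~ N(0, 2), ε_t ~ N(0, 10), with the second parameter read as a standard deviation here), the random sparse binary U, and the radius-20 feasible sphere come from the quoted description; the problem dimensions, the sparsity level of U, and the low-rank linear demand form q(p) = z + V Uᵀ p + ε are assumptions made for illustration, since equation (2) itself is not quoted above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions and sparsity are illustrative assumptions; only the distributions,
# the sparse binary U, and the radius-20 feasible sphere come from the paper's text.
n, k = 100, 5                                  # number of products, latent dimension (assumed)
z = rng.normal(100.0, 20.0, size=n)            # z_ij ~ N(100, 20)
V = rng.normal(0.0, 2.0, size=(n, k))          # V_ij ~ N(0, 2)
U = (rng.random((n, k)) < 0.1).astype(float)   # random sparse binary matrix (10% density assumed)
RADIUS = 20.0                                  # feasible set S: centered sphere of radius r = 20

def project_to_sphere(p, r=RADIUS):
    """Euclidean projection of a (rescaled) price vector onto the centered ball of radius r."""
    nrm = np.linalg.norm(p)
    return p if nrm <= r else p * (r / nrm)

def noisy_demand(p):
    """Assumed low-rank linear demand; the paper's equation (2) may differ in form and sign."""
    eps = rng.normal(0.0, 10.0, size=n)        # per-product noise, eps_t ~ N(0, 10)
    return z + V @ (U.T @ p) + eps

p = project_to_sphere(rng.normal(size=n))      # a feasible rescaled price vector
revenue = float(p @ noisy_demand(p))           # scalar bandit feedback at price p
```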
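
For context on the GDG baseline mentioned in the Research Type row, below is a hedged sketch of the one-point bandit gradient method of Flaxman et al. [2005], applied to revenue maximization over the price sphere. This is only the comparison baseline, not the paper's OPOK or OPOL algorithms; the objective p·q(p), step size, exploration radius, and horizon are illustrative choices rather than values from the paper.

```python
import numpy as np

def gdg_pricing(noisy_demand, n, T=1000, radius=20.0, eta=0.01, delta=1.0, seed=0):
    """One-point bandit gradient ascent on revenue, in the style of Flaxman et al. [2005].

    Each round: perturb the current price vector along a random unit direction,
    observe only the scalar revenue at the perturbed price, and use
    (n / delta) * revenue * direction as a one-point gradient estimate.
    All hyperparameters are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(seed)

    def project(p, r):
        nrm = np.linalg.norm(p)
        return p if nrm <= r else p * (r / nrm)

    y = np.zeros(n)                                      # center of the exploration ball
    revenues = []
    for _ in range(T):
        u = rng.normal(size=n)
        u /= np.linalg.norm(u)                           # uniform direction on the unit sphere
        p = project(y, radius - delta) + delta * u       # perturbed price, still inside S
        rev = float(p @ noisy_demand(p))                 # bandit feedback: revenue only
        grad_est = (n / delta) * rev * u                 # one-point gradient estimate
        y = project(y + eta * grad_est, radius - delta)  # gradient-ascent step, kept feasible
        revenues.append(rev)
    return revenues
```

With `noisy_demand` from the previous sketch, `gdg_pricing(noisy_demand, n)` returns a per-round revenue trajectory that could be plotted against the round number, the kind of comparison the quoted text attributes to Figures 1A and 1B.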