Constrained Market Share Maximization by Signal-Guided Optimization

Authors: Bo Hui, Yuchen Fang, Tian Xia, Sarp Aykent, Wei-Shinn Ku

AAAI 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In the experiment, we empirically verify the superiority of both our prediction model and optimization approach over existing works with large-scale real-world data.
Researcher Affiliation | Academia | 1 Auburn University; 2 Beijing University of Posts and Telecommunications
Pseudocode | Yes | Algorithm 1: Signal-guided two-timescale Optimization
Open Source Code | Yes | Our code has been released at: https://github.com/codingAndBS/AirlineMarket.
Open Datasets | Yes | We conduct extensive experiments on the airline origin and destination survey (DB1B) dataset released by the US Department of Transportation's Bureau of Transportation Statistics (BTS), which records the flight route information in the US and releases 10% of tickets sold in the US every month of the year for research purposes (https://www.transtats.bts.gov/DataIndex.asp). (A market-share aggregation sketch follows this table.)
Dataset Splits | Yes | From the prediction perspective, we aim to learn a neural network-based model by leveraging data of the top 3 airlines in the past ten years except for the last month's data. The data of the last month is used for testing. ... To prevent over-fitting for the data of the last month, we use the 10-fold cross-validation method to choose the best hyper-parameter, which means we have nine 12-month folds and one 11-month fold. Following the standard cross-validation, we randomly select nine folds to train our model and utilize the remaining fold as the validation set to tune hyper-parameters and repeat this process nine times. (A split sketch follows this table.)
Hardware Specification | No | The paper does not specify any particular hardware details such as GPU models, CPU models, or memory used for running the experiments.
Software Dependencies | No | The paper mentions the use of the Adam optimizer but does not specify version numbers for any software dependencies or libraries.
Experiment Setup | Yes | We train our market share prediction model with the learning rate of 1e-3, epochs of 500, and the Adam optimizer for updating weights. The number of structure-aware attention layers in our model is 4 and the dimension of the hidden states in our model is 64. (A training-configuration sketch follows this table.)
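
For context on the Open Datasets entry, the sketch below shows one way carrier market shares on origin-destination markets could be aggregated from a DB1B Market quarterly extract. The column names (Origin, Dest, RPCarrier, Passengers) follow the public DB1B Market schema, but the authors' actual preprocessing is not described here, so treat the whole function as an assumption rather than the paper's pipeline.

```python
import pandas as pd

def market_shares(csv_path: str, top_k: int = 3) -> pd.DataFrame:
    """Hypothetical aggregation of carrier market shares from a DB1B Market file."""
    cols = ["Origin", "Dest", "RPCarrier", "Passengers"]
    df = pd.read_csv(csv_path, usecols=cols)

    # Passengers sold by each carrier on each origin-destination market.
    by_carrier = (
        df.groupby(["Origin", "Dest", "RPCarrier"], as_index=False)["Passengers"].sum()
    )
    # Total passengers on each market, then each carrier's share of it.
    totals = by_carrier.groupby(["Origin", "Dest"])["Passengers"].transform("sum")
    by_carrier["share"] = by_carrier["Passengers"] / totals

    # Keep only the top-k carriers by overall volume, mirroring the paper's
    # focus on the top 3 airlines.
    top = by_carrier.groupby("RPCarrier")["Passengers"].sum().nlargest(top_k).index
    return by_carrier[by_carrier["RPCarrier"].isin(top)]
```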
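
The quoted split is temporal rather than random at the month level: the last month is held out for testing and the remaining months form ten folds (nine of 12 months, one of 11). A minimal sketch of that split, assuming 120 consecutive months indexed 0..119 (the exact count and indexing are assumptions), could look like this:

```python
import random

months = list(range(120))
test_month = months[-1]          # last month reserved for testing
train_val_months = months[:-1]   # remaining 119 months

# Nine 12-month folds plus one final 11-month fold.
folds = [train_val_months[i * 12:(i + 1) * 12] for i in range(9)]
folds.append(train_val_months[9 * 12:])

random.seed(0)
for val_fold in random.sample(folds, 9):  # nine validation rounds, per the quoted text
    train_months = [m for f in folds if f is not val_fold for m in f]
    # train on `train_months`, tune hyper-parameters on `val_fold`
```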
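
The quoted hyper-parameters (Adam, learning rate 1e-3, 500 epochs, 4 structure-aware attention layers, hidden dimension 64) can be wired up as follows. This is only a sketch: MarketSharePredictor is a placeholder module, not the paper's attention architecture, and the input dimension, batch, and loss function are assumptions.

```python
import torch
import torch.nn as nn

NUM_LAYERS, HIDDEN_DIM, EPOCHS, LR = 4, 64, 500, 1e-3

class MarketSharePredictor(nn.Module):
    """Stand-in for the structure-aware attention stack; only depth/width match the paper."""
    def __init__(self, in_dim: int, hidden_dim: int = HIDDEN_DIM, num_layers: int = NUM_LAYERS):
        super().__init__()
        layers = [nn.Linear(in_dim, hidden_dim), nn.ReLU()]
        for _ in range(num_layers - 1):
            layers += [nn.Linear(hidden_dim, hidden_dim), nn.ReLU()]
        layers.append(nn.Linear(hidden_dim, 1))
        self.net = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = MarketSharePredictor(in_dim=32)                    # input dimension is assumed
optimizer = torch.optim.Adam(model.parameters(), lr=LR)    # Adam with lr 1e-3, as quoted
loss_fn = nn.MSELoss()                                     # loss function is assumed

for epoch in range(EPOCHS):                                # 500 epochs, as quoted
    x, y = torch.randn(256, 32), torch.rand(256, 1)        # placeholder batch of market features/shares
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```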