Equity Promotion in Online Resource Allocation

Authors: Pan Xu, Yifan Xu

AAAI 2022, pp. 9962-9970 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We test our model and algorithms on publicly available COVID-19 vaccination datasets maintained by the Minnesota Department of Health. Experimental results confirm our theoretical predictions and demonstrate the power of our policies in navigating the distribution of limited resources toward the preset target ratios when compared against heuristics.
Researcher Affiliation | Academia | Pan Xu (1), Yifan Xu (2); (1) Department of Computer Science, New Jersey Institute of Technology, Newark, USA; (2) Key Lab of CNII, MOE, Southeast University, Nanjing, China
Pseudocode | Yes | Algorithm 1: An LP-based sampling (SAMP). (See the sketch after the table.)
Open Source Code | No | The paper does not provide any statement or link indicating the availability of open-source code for the described methodology.
Open Datasets | Yes | We use the publicly available COVID-19 vaccination datasets that are maintained by the Minnesota Department of Health (https://mn.gov/covid19/vaccine/data/index.jsp).
Dataset Splits | No | The paper describes the construction of the input instance from the datasets but does not explicitly provide details about training, validation, or test dataset splits, percentages, or sample counts for reproducibility.
Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., GPU/CPU models, memory, or cloud computing resources) used to run the experiments.
Software Dependencies | No | The paper does not specify any software dependencies or versions (e.g., programming languages, libraries, frameworks, or solvers with version numbers) used for the experiments.
Experiment Setup | Yes | We test different settings when the supply scarcity ρ takes values in {1, 1.5, 2, 2.5, 3} while the minimum serving capacity is fixed at b = 1. For each setting, we run all algorithms 100 times and take the average as the final performance. (See the experiment-loop sketch after the table.)
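The Pseudocode row names Algorithm 1, an LP-based sampling policy (SAMP). Below is a minimal, hedged sketch of a generic LP-based sampling policy in that spirit: solve a small offline LP for per-group serving levels, then serve each online arrival with probability proportional to its group's LP value. The group counts, arrival rates, budget, equity floor, and all variable names are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of a generic LP-based sampling policy (in the spirit of
# SAMP). All inputs below (arrival_rates, budget, target_ratios, the 0.8
# equity floor) are assumed values for demonstration only.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# --- Offline phase: solve a small LP for per-group serving levels ----------
arrival_rates = np.array([40.0, 30.0, 30.0])   # expected arrivals per group (assumed)
budget = 60.0                                  # total resource budget (assumed)
target_ratios = np.array([0.4, 0.3, 0.3])      # preset target shares (assumed)

n = len(arrival_rates)
# Maximize total served subject to: serve_i <= arrivals_i, sum_i serve_i <= budget,
# and serve_i >= 0.8 * target_i * budget (an assumed soft equity floor).
c = -np.ones(n)                                # linprog minimizes, so negate
A_ub = np.vstack([np.eye(n), np.ones((1, n)), -np.eye(n)])
b_ub = np.concatenate([arrival_rates, [budget], -0.8 * target_ratios * budget])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * n)
x_star = res.x                                 # fractional serving levels

# --- Online phase: serve each arrival of group g w.p. x*_g / r_g -----------
remaining = budget
served = np.zeros(n)
arrivals = rng.choice(n, size=100, p=arrival_rates / arrival_rates.sum())
for g in arrivals:
    if remaining >= 1 and rng.random() < x_star[g] / arrival_rates[g]:
        served[g] += 1
        remaining -= 1

print("LP solution x*:", np.round(x_star, 2))
print("Served per group:", served, "| realized shares:", np.round(served / served.sum(), 2))
```

The sampling ratio x*_g / r_g is at most 1 because the LP caps each group's serving level by its expected arrivals, so the online phase never tries to serve more than the offline plan allows in expectation.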
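The Experiment Setup row describes sweeping the supply scarcity ρ over {1, 1.5, 2, 2.5, 3} with minimum serving capacity b = 1 and averaging 100 runs per setting. The sketch below mirrors that driver loop; `run_policy`, the policy labels, and the returned metric are hypothetical placeholders, not the authors' code.

```python
# Hedged sketch of the reported experiment loop: for each scarcity level rho,
# run every policy 100 times and report the mean performance.
import random
import statistics

def run_policy(name: str, rho: float, b: int, seed: int) -> float:
    """Placeholder for one simulation run; returns a performance score."""
    random.seed(hash((name, rho, seed)))
    return random.random() / rho        # stand-in for a real metric

RHO_VALUES = [1, 1.5, 2, 2.5, 3]        # supply scarcity settings from the paper
B = 1                                   # minimum serving capacity from the paper
N_RUNS = 100                            # runs averaged per setting, per the paper
POLICIES = ["SAMP", "heuristic"]        # assumed labels

for rho in RHO_VALUES:
    for name in POLICIES:
        scores = [run_policy(name, rho, B, s) for s in range(N_RUNS)]
        print(f"rho={rho:>3}, policy={name:>9}: mean={statistics.mean(scores):.3f}")
```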