Fair Allocation of Indivisible Chores: Beyond Additive Costs

Authors: Bo Li, Fangxiao Wang, Yu Zhou

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this work, we first prove that no algorithm can ensure better than min{n, log m / log log m}-approximation if the cost functions are submodular. This result stands in sharp contrast with the allocation of goods, where constant approximations exist, as shown by Barman and Krishnamurthy [TEAC, 2020] and Ghodsi et al. [AIJ, 2022]. We then prove that for subadditive costs there always exists an allocation that is min{n, log m}-approximate, so the approximation ratio is asymptotically tight.
Researcher Affiliation | Academia | Department of Computing, The Hong Kong Polytechnic University, Hong Kong, China. comp-bo.li@polyu.edu.hk, fangxiao.wang@connect.polyu.hk, csyzhou@comp.polyu.edu.hk
Pseudocode | Yes | Algorithm 1: Computing a min{n, log m}-MMS allocation for subadditive costs; Algorithm 2: Partitioning the items into d bundles; Algorithm 3: Computing 2-MMS allocations for the bin packing setting; Algorithm 4: IDO reduction for the bin packing and job scheduling settings; Algorithm 5: Partitioning the items into d bundles; Algorithm 6: Imaginary assignment; Algorithm 7: Allocating the items to the agents.
Open Source Code | No | The paper does not mention open-source code or link to a code repository for its methodology.
Open Datasets | No | The paper is theoretical and does not use datasets for training or evaluation.
Dataset Splits | No | The paper is theoretical and does not describe training/validation/test splits.
Hardware Specification | No | The paper is theoretical and does not describe any hardware used for experiments.
Software Dependencies | No | The paper is theoretical and does not list software dependencies or version numbers.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup such as hyperparameters or training settings.