The Adaptive Complexity of Maximizing a Gross Substitutes Valuation

Authors: Ron Kupfer, Sharon Qian, Eric Balkanski, Yaron Singer

NeurIPS 2020

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | "Additionally, we conduct experiments on synthetic and real data sets to demonstrate the near-optimal performance and efficiency of the algorithm in practice." |
| Researcher Affiliation | Academia | Ron Kupfer, The Hebrew University of Jerusalem (ron.kupfer@mail.huji.ac.il); Eric Balkanski, Harvard University (ericbalkanski@g.harvard.edu); Sharon Qian, Harvard University (sharonqian@g.harvard.edu); Yaron Singer, Harvard University (yaron@seas.harvard.edu) |
| Pseudocode | Yes | "Algorithm 1 IMPATIENT GREEDY" |
| Open Source Code | No | The paper does not include an unambiguous statement about releasing source code for the described methodology, nor does it provide a direct link to a code repository. |
| Open Datasets | No | The paper describes constructing synthetic and Twitter graphs for its experiments, detailing their generation in the 'Synthetic graphs' and 'Twitter graphs' sections, but it does not provide links, DOIs, or citations to publicly available versions of these datasets. |
| Dataset Splits | No | The paper does not explicitly provide training, validation, or test dataset splits. It describes generating and using datasets to evaluate the proposed algorithm, but it does not define data partitions for training or hyperparameter tuning. |
| Hardware Specification | No | The paper does not explicitly describe the hardware used to run its experiments, such as specific GPU or CPU models, or cloud resources with detailed specifications. |
| Software Dependencies | No | The paper does not provide specific ancillary software details, such as library or solver names with version numbers, needed to replicate the experiments. |
| Experiment Setup | Yes | "For all of the experiments we have used ϵ = 0.1. We select k = 100 elements on synthetic data and k = 150 on real data." |
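
The Pseudocode and Experiment Setup rows above mention Algorithm 1 (IMPATIENT GREEDY), a precision parameter ϵ = 0.1, and a cardinality budget k of 100 or 150. The sketch below is only an illustration of a threshold-style greedy under assumed details, not the authors' IMPATIENT GREEDY: it uses a generic set-function oracle `value`, accepts in each round every remaining element whose start-of-round marginal is within a (1 − ϵ) factor of that round's best marginal, and in the usage example substitutes a simple additive valuation (which is a gross substitutes valuation). The function name, oracle interface, and element universe are hypothetical.

```python
import random


def threshold_greedy_sketch(elements, value, k, eps=0.1):
    """Illustrative threshold-greedy sketch (not the paper's IMPATIENT GREEDY).

    In each round, marginal values are computed once against the current
    solution S; every remaining element whose marginal is within a (1 - eps)
    factor of that round's best marginal is accepted, until |S| = k.
    `value` is assumed to be a monotone set-function oracle: value(list) -> float.
    """
    S = []
    remaining = set(elements)
    while len(S) < k and remaining:
        base = value(S)
        # Marginal contributions at the start of the round.
        marginals = {e: value(S + [e]) - base for e in remaining}
        best = max(marginals.values())
        if best <= 0:
            break  # no remaining element improves the solution
        # Accept every near-best element this round; marginals are not
        # refreshed between acceptances within the round.
        for e in sorted(remaining, key=lambda x: -marginals[x]):
            if len(S) >= k:
                break
            if marginals[e] >= (1 - eps) * best:
                S.append(e)
                remaining.discard(e)
    return S


if __name__ == "__main__":
    # Hypothetical usage: an additive valuation (a gross substitutes valuation)
    # over 1000 elements, with the eps and k values reported in the paper.
    random.seed(0)
    weights = {i: random.random() for i in range(1000)}
    additive = lambda S: sum(weights[e] for e in S)
    chosen = threshold_greedy_sketch(range(1000), additive, k=100, eps=0.1)
    print(len(chosen), round(additive(chosen), 3))
```

For the additive valuation in the usage example, marginals equal the fixed element weights, so the sketch simply returns the k heaviest elements; that makes it easy to sanity-check, but it says nothing about the adaptivity guarantees the paper establishes for its actual algorithm.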