On the Pseudo-Dimension of Nearly Optimal Auctions

Authors: Jamie H. Morgenstern, Tim Roughgarden

NeurIPS 2015

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | This paper develops a general approach, rooted in statistical learning theory, to learning an approximately revenue-maximizing auction from data. We introduce t-level auctions to interpolate between simple auctions, such as welfare maximization with reserve prices, and optimal auctions, thereby balancing the competing demands of expressivity and simplicity. We prove that such auctions have small representation error, in the sense that for every product distribution F over bidders' valuations, there exists a t-level auction with small t and expected revenue close to optimal. We show that the set of t-level auctions has modest pseudo-dimension (for polynomial t) and therefore leads to small learning error. One consequence of our results is that, in arbitrary single-parameter settings, one can learn a mechanism with expected revenue arbitrarily close to optimal from a polynomial number of samples.
Researcher Affiliation | Academia | Jamie Morgenstern, Computer and Information Science, University of Pennsylvania, Philadelphia, PA (jamiemor@cis.upenn.edu); Tim Roughgarden, Stanford University, Palo Alto, CA (tim@cs.stanford.edu)
Pseudocode | No | The paper describes an allocation rule and a payment rule in prose, but does not present them in a structured pseudocode or algorithm block; a hedged sketch of such a rule appears below this table.
Open Source Code | No | The paper does not mention providing open-source code for the described methodology.
Open Datasets | No | The paper discusses learning from 'i.i.d. samples from F' but does not specify any public dataset or provide access information for data.
Dataset Splits | No | The paper is theoretical and does not describe empirical experiments or dataset splits.
Hardware Specification | No | The paper is theoretical and does not describe computational experiments that would require specific hardware.
Software Dependencies | No | The paper is theoretical and does not describe computational experiments that would require versioned software dependencies.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup such as hyperparameters or training settings.
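For concreteness, below is a minimal Python sketch of how a single-item t-level auction of the kind summarized above could be implemented. It is a hypothetical illustration, not code from the paper: the function name t_level_auction, the definition of a bidder's "level" as the number of their thresholds met by their bid, and the tie-breaking by lower bidder index are assumptions, and the payment charged is the standard minimum-winning-bid (threshold) payment used for truthful single-parameter mechanisms.

```python
from bisect import bisect_right
from typing import List, Optional, Tuple

def t_level_auction(bids: List[float],
                    thresholds: List[List[float]]) -> Tuple[Optional[int], float]:
    """Single-item t-level auction (illustrative sketch, not the paper's exact rule).

    thresholds[i] is a nondecreasing list of t thresholds for bidder i;
    the lowest one acts as bidder i's reserve price.
    Returns (winner index or None, payment).
    """
    n = len(bids)
    # "Level" of bidder i: how many of their thresholds their bid meets or exceeds.
    levels = [bisect_right(thresholds[i], bids[i]) for i in range(n)]

    # Allocation: the highest level wins; assumed tie-break is lowest bidder index.
    # A bidder must clear their lowest threshold (level >= 1) to be served.
    winner = max(range(n), key=lambda i: (levels[i], -i))
    if levels[winner] == 0:
        return None, 0.0  # no bidder clears their reserve threshold

    # Payment: the minimum bid at which the winner would still win (threshold payment).
    rivals = [(levels[i], -i) for i in range(n) if i != winner]
    rival_level, rival_neg_idx = max(rivals) if rivals else (0, -(winner + 1))
    # Matching the best rival's level suffices when the winner has the smaller
    # index; otherwise the winner must exceed that level by one.
    needed = rival_level if winner < -rival_neg_idx else rival_level + 1
    needed = max(needed, 1)  # never below the reserve threshold
    return winner, thresholds[winner][needed - 1]

if __name__ == "__main__":
    # Two bidders, t = 3 thresholds each (hypothetical numbers).
    bids = [7.0, 5.5]
    thresholds = [[2.0, 4.0, 8.0], [3.0, 5.0, 6.0]]
    print(t_level_auction(bids, thresholds))  # bidder 0 wins at level 2 and pays 4.0
```

Under these assumptions, learning would amount to choosing the threshold matrix from samples; the paper's results bound how many samples suffice for the empirically best t-level auction to be near-optimal.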