Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Contextual Pricing for Lipschitz Buyers

Authors: Jieming Mao, Renato Paes Leme, Jon Schneider

NeurIPS 2018

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We investigate the problem of learning a Lipschitz function from binary feedback. ... For the symmetric loss ... we provide an algorithm for this problem achieving total loss O(log T) when d = 1 and O(T^{(d-1)/d}) when d > 1, and show that both bounds are tight (up to a factor of log T). To prove these results, we investigate a more general problem, which we term learning a Lipschitz function with binary feedback, and which may be of independent interest.
Researcher Affiliation | Collaboration | Jieming Mao (University of Pennsylvania); Renato Paes Leme (Google Research); Jon Schneider (Google Research)
Pseudocode | Yes | Algorithm 1: Algorithm for learning an L-Lipschitz function from R to R under symmetric loss with regret O(L log T). ... Algorithm 2: Midpoint algorithm for learning an L-Lipschitz function from R to R under symmetric loss with regret O(L log T).
Open Source Code | No | The paper does not provide any statement or link indicating the availability of open-source code for the described methodology.
Open Datasets | No | The paper focuses on theoretical algorithms and their bounds; it does not describe the use of any dataset for training, nor does it provide access information for a publicly available dataset.
Dataset Splits | No | The paper is theoretical and does not describe experimental validation on data, so no training, validation, or test dataset splits are reported.
Hardware Specification | No | The paper is theoretical and focuses on algorithms and proofs; it does not report any hardware used for experiments.
Software Dependencies | No | The paper does not list software components, libraries, or solvers with version numbers, as it focuses on theoretical contributions rather than implementation details or experimental setups.
Experiment Setup | No | The paper describes algorithms and proofs rather than empirical experiments, so it does not report experimental setup details such as hyperparameters or system-level training settings.
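The midpoint strategy referenced in the Pseudocode row (Algorithm 2, the d = 1 symmetric-loss case) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the uniform choice of query points, the range assumption f: [0,1] -> [0,1], and the O(t) scan over past feedback are all assumptions made for the demo.

```python
import random

def midpoint_learner(f, L, T):
    """Illustrative sketch: learn an L-Lipschitz f: [0,1] -> [0,1]
    from binary feedback, guessing the midpoint of the feasible interval."""
    history = []       # (x_i, y_i, s_i) with s_i = +1 iff f(x_i) > y_i
    total_loss = 0.0
    for _ in range(T):
        x = random.random()  # an adversary would pick this; uniform here
        # Bounds on f(x) implied by past feedback and L-Lipschitzness.
        low, high = 0.0, 1.0
        for (xi, yi, s) in history:
            if s > 0:   # f(xi) > yi  =>  f(x) > yi - L*|x - xi|
                low = max(low, yi - L * abs(x - xi))
            else:       # f(xi) <= yi =>  f(x) <= yi + L*|x - xi|
                high = min(high, yi + L * abs(x - xi))
        y = (low + high) / 2.0           # midpoint guess
        s = 1 if f(x) > y else -1        # binary feedback only
        total_loss += abs(f(x) - y)      # symmetric loss |f(x) - y|
        history.append((x, y, s))
    return total_loss
```

The linear scan over `history` makes this O(T^2) overall; it is kept only for readability, whereas an efficient implementation would maintain the piecewise-linear upper and lower envelopes directly.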