The Power of Optimization from Samples
Authors: Eric Balkanski, Aviad Rubinstein, Yaron Singer
NeurIPS 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "We also perform experiments on synthetic hard instances of monotone submodular functions that convey some interpretation of our results." |
| Researcher Affiliation | Academia | Eric Balkanski (Harvard University, ericbalkanski@g.harvard.edu); Aviad Rubinstein (University of California, Berkeley, aviad@eecs.berkeley.edu); Yaron Singer (Harvard University, yaron@seas.harvard.edu) |
| Pseudocode | Yes | Algorithm 1: "A tight (1 − c)/(1 + c − c²) − o(1)-optimization from samples algorithm for monotone submodular functions with curvature c" (see the sketch after the table). |
| Open Source Code | No | The paper does not provide any specific links or statements indicating that the source code for their methodology is publicly available. |
| Open Datasets | No | The paper refers to using "synthetic functions" in its experiments but does not mention specific, publicly available datasets or provide access information for any data used. |
| Dataset Splits | No | The paper discusses simulations on "synthetic functions" but does not provide specific details on how data was split for training, validation, or testing (e.g., percentages, sample counts, or cross-validation schemes). |
| Hardware Specification | No | The paper describes the algorithms and theoretical results, and mentions performing "simulations," but it does not specify any hardware used for these simulations (e.g., CPU, GPU models, memory, etc.). |
| Software Dependencies | No | The paper describes the algorithm and theoretical analysis but does not provide any specific software dependencies with version numbers (e.g., programming languages, libraries, or frameworks). |
| Experiment Setup | No | The paper describes the proposed algorithm (Algorithm 1) and mentions conducting "simulations on simple synthetic functions," but it does not provide specific experimental setup details such as hyperparameters, learning rates, batch sizes, or other training configurations. |
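The "Pseudocode" row refers to the paper's Algorithm 1, which maximizes a monotone submodular function with curvature c given only (set, value) samples; note that the guarantee (1 − c)/(1 + c − c²) equals 1 at c = 0 (modular functions) and degrades as c approaches 1. The following Python sketch illustrates the optimization-from-samples setting only; it is not a reproduction of the paper's Algorithm 1. It estimates each element's average contribution from random samples and returns the top-k elements. All function names and the additive toy objective are illustrative assumptions.

```python
import random


def estimate_element_values(samples):
    """Estimate each element's value from (set, value) samples.

    For every element, take the mean f(S) over sampled sets containing it
    minus the mean f(S) over sampled sets excluding it -- a crude stand-in
    for its expected marginal contribution to a random set.
    """
    elements = set().union(*(s for s, _ in samples))
    est = {}
    for e in elements:
        in_vals = [v for s, v in samples if e in s]
        out_vals = [v for s, v in samples if e not in s]
        mean_in = sum(in_vals) / len(in_vals) if in_vals else 0.0
        mean_out = sum(out_vals) / len(out_vals) if out_vals else 0.0
        est[e] = mean_in - mean_out
    return est


def top_k_from_samples(samples, k):
    """Return the k elements with the highest estimated contribution."""
    est = estimate_element_values(samples)
    return set(sorted(est, key=est.get, reverse=True)[:k])


if __name__ == "__main__":
    # Toy additive objective f(S) = sum of hidden per-element weights,
    # observed only on uniformly random subsets of a 20-element ground set.
    rng = random.Random(0)
    weights = {e: rng.random() for e in range(20)}
    f = lambda s: sum(weights[e] for e in s)
    samples = []
    for _ in range(500):
        s = frozenset(e for e in weights if rng.random() < 0.5)
        samples.append((s, f(s)))
    # For an additive f this recovers the top-weight elements; the paper's
    # hardness results show no sample-based method achieves a constant
    # approximation for general monotone submodular functions.
    print(sorted(top_k_from_samples(samples, 5)))
```

For the additive toy function above, the estimator recovers the optimal set with high probability; the interesting regime in the paper is precisely where such per-element statistics fail, which its hard synthetic instances are constructed to exhibit.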