Optimization of Smooth Functions with Noisy Observations: Local Minimax Rates
Authors: Yining Wang, Sivaraman Balakrishnan, Aarti Singh
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We propose a local minimax framework to study the fundamental difficulty of optimizing smooth functions with adaptive function evaluations. Our main result characterizes the local convergence rate $R_n(f_0)$ for a wide range of reference functions $f_0 \in \mathcal{F}$. We prove local minimax lower bounds that match the $n^{-\alpha/(2\alpha+d-\alpha\beta)}$ upper bound, up to logarithmic factors in $n$. |
| Researcher Affiliation | Academia | Yining Wang, Sivaraman Balakrishnan, Aarti Singh; Department of Machine Learning and Statistics, Carnegie Mellon University, Pittsburgh, PA 15213, USA. {yiningwa,aarti}@cs.cmu.edu, siva@stat.cmu.edu |
| Pseudocode | No | The paper includes a conceptual illustration of an algorithm in Figure 1 and describes its procedure in Section A. However, it does not present a formal, structured pseudocode block labeled "Algorithm" or "Pseudocode". |
| Open Source Code | No | The paper does not provide any statement about releasing source code or a link to a code repository for the methodology described. |
| Open Datasets | No | The paper is theoretical and models a system (Eq. 1) but does not use real-world datasets for training or experimentation. Therefore, no information about publicly available datasets is provided. |
| Dataset Splits | No | The paper is theoretical and does not involve empirical experiments with datasets. Therefore, no information regarding training, validation, or test splits is provided. |
| Hardware Specification | No | The paper is theoretical and does not describe any computational experiments. Therefore, no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and does not mention any specific software components or their versions required for reproduction. |
| Experiment Setup | No | The paper is theoretical and focuses on mathematical derivations and bounds. It does not describe any empirical experimental setup, hyperparameters, or training configurations. |
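
The rate quoted in the Research Type row was garbled by PDF extraction; under the reading that $\alpha$ is the Hölder smoothness of $f_0$, $d$ the ambient dimension, and $\beta$ the exponent of the paper's local growth condition around the minimizer, the local minimax rate can be written as:

```latex
% Local minimax rate characterized in the paper: matching upper and
% lower bounds on R_n(f_0), up to logarithmic factors in n.
R_n(f_0) \asymp n^{-\frac{\alpha}{2\alpha + d - \alpha\beta}}
```

Note that for larger $\beta$ (faster growth around the optimum) the exponent increases, so the rate improves, consistent with the paper's message that local difficulty depends on the reference function $f_0$.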