Computational Separations between Sampling and Optimization
Authors: Kunal Talwar
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We present a simpler and stronger separation. We then compare sampling and optimization in more detail and show that they are provably incomparable: there are families of continuous functions for which optimization is easy but sampling is NP-hard, and vice versa. Further, we show function families that exhibit a sharp phase transition in the computational complexity of sampling, as one varies the natural temperature parameter. Our results draw on a connection to analogous separations in the discrete setting which are well-studied. |
| Researcher Affiliation | Industry | Kunal Talwar, Google Brain, Mountain View, CA. kunal@google.com |
| Pseudocode | No | The paper describes algorithms but does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. |
| Open Datasets | No | The paper is theoretical and does not mention using or providing access to any empirical dataset for training. |
| Dataset Splits | No | The paper is theoretical and does not provide any training/validation/test split information. |
| Hardware Specification | No | The paper is theoretical and does not provide specific hardware details used for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not list software dependencies or their version numbers. |
| Experiment Setup | No | The paper is theoretical and does not contain experimental setup details such as hyperparameter values or training configurations. |