Phase Transitions in the Pooled Data Problem
Authors: Jonathan Scarlett, Volkan Cevher
NeurIPS 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In the noiseless setting, we identify an exact asymptotic threshold on the required number of tests with optimal decoding, and prove a phase transition between complete success and complete failure. In addition, we present a novel noisy variation of the problem, and provide an information-theoretic framework for characterizing the required number of tests for general random noise models. Our results reveal that noise can make the problem considerably more difficult, with strict increases in the scaling laws even at low noise levels. Finally, we demonstrate similar behavior in an approximate recovery setting, where a given number of errors is allowed in the decoded labels. |
| Researcher Affiliation | Academia | Jonathan Scarlett and Volkan Cevher, Laboratory for Information and Inference Systems (LIONS), École Polytechnique Fédérale de Lausanne (EPFL), {jonathan.scarlett,volkan.cevher}@epfl.ch |
| Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper is theoretical and does not mention providing open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and does not use any datasets for training or evaluation. |
| Dataset Splits | No | The paper is theoretical and does not involve dataset splits for validation. |
| Hardware Specification | No | The paper is purely theoretical and does not describe any experimental setup or mention specific hardware used. |
| Software Dependencies | No | The paper is purely theoretical and does not mention any specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is purely theoretical and does not describe any experimental setup, hyperparameters, or training configurations. |
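
For context on the setting the paper analyzes (the Research Type row above quotes the abstract), the following is a minimal, illustrative Python sketch of the noiseless pooled data observation model: p items carry hidden categorical labels, each test pools a random subset of items, and only the histogram of labels within the pool is observed. The Bernoulli(1/2) pooling design, the parameter values, and the function name are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def simulate_pooled_data(p=1000, d=2, n=200, pi=None, seed=0):
    """Sketch of the noiseless pooled data observation model (illustrative only).

    p items have hidden labels in {0, ..., d-1} drawn with proportions pi; each of
    the n tests pools a random subset of items and reveals only the histogram of
    labels within that pool. The Bernoulli(1/2) pooling design is an assumption.
    """
    rng = np.random.default_rng(seed)
    pi = np.full(d, 1.0 / d) if pi is None else np.asarray(pi)
    labels = rng.choice(d, size=p, p=pi)          # hidden ground-truth labels
    pools = rng.random((n, p)) < 0.5              # each item joins each pool w.p. 1/2
    one_hot = np.eye(d, dtype=int)[labels]        # p x d label-indicator matrix
    histograms = pools.astype(int) @ one_hot      # n x d matrix of per-pool label counts
    return labels, pools, histograms

if __name__ == "__main__":
    labels, pools, Y = simulate_pooled_data()
    print("first test histogram:", Y[0], "| pool size:", int(pools[0].sum()))
```

The paper's results concern how many such tests n are information-theoretically necessary and sufficient to recover all p labels (exactly or approximately, with or without noise on the histograms); the sketch above only generates the observations, not a decoder.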