SQ Lower Bounds for Learning Single Neurons with Massart Noise
Authors: Ilias Diakonikolas, Daniel Kane, Lisheng Ren, Yuxin Sun
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our work is purely theoretical in nature. For a range of activation functions, including ReLUs, we establish superpolynomial Statistical Query (SQ) lower bounds for this learning problem. |
| Researcher Affiliation | Academia | Ilias Diakonikolas, University of Wisconsin-Madison (ilias@cs.wisc.edu); Daniel M. Kane, University of California, San Diego (dakane@cs.ucsd.edu); Lisheng Ren, University of Wisconsin-Madison (lren29@wisc.edu); Yuxin Sun, University of Wisconsin-Madison (yxsun@cs.wisc.edu) |
| Pseudocode | No | The paper describes its constructions and mathematical procedures using prose and definitions but does not include any formal pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement about releasing source code for its methodology, nor does it provide a link to a code repository. The ethics checklist marks N/A for code. |
| Open Datasets | No | The paper is theoretical and focuses on lower bounds for learning problems with statistical queries; it does not use or refer to any publicly available or open datasets for empirical training or evaluation. |
| Dataset Splits | No | The paper is theoretical and does not perform experiments that require dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies or version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with hyperparameters or system-level training settings. |
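Since the paper studies learning a single neuron under Massart noise but contains no code, the learning problem itself can be illustrated with a short sketch. The following is a minimal, hypothetical simulation (not from the paper): clean labels are produced by a single ReLU neuron on Gaussian inputs, and an adversary independently corrupts each label with probability at most `eta`, which is one standard reading of the Massart contamination model. The corruption value `-1.0` is an arbitrary illustrative choice.

```python
import numpy as np


def relu(z):
    """ReLU activation applied elementwise."""
    return np.maximum(z, 0.0)


def sample_massart_relu(n, d, w, eta, rng):
    """Sample (x, y) pairs for a single ReLU neuron under Massart-style noise.

    Inputs x ~ N(0, I_d); the clean label is relu(<w, x>). Each label is
    independently corrupted with probability at most eta (here: exactly eta),
    and a corrupted label is replaced by an adversarially chosen value.
    All names and choices here are illustrative, not from the paper.
    """
    X = rng.standard_normal((n, d))
    clean = relu(X @ w)
    corrupt = rng.random(n) < eta      # per-point corruption, prob <= eta
    y = clean.copy()
    y[corrupt] = -1.0                  # adversary's choice (illustrative)
    return X, y, clean, corrupt


rng = np.random.default_rng(0)
d = 5
w = np.ones(d) / np.sqrt(d)            # unit-norm weight vector
X, y, clean, corrupt = sample_massart_relu(10_000, d, w, eta=0.1, rng=rng)
```

On the uncorrupted points the observed label agrees exactly with the neuron's output; the SQ lower bounds in the paper say that, despite this seemingly mild corruption, no efficient statistical-query algorithm can achieve small error.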