Testably Learning Polynomial Threshold Functions
Authors: Lucas Slot, Stefan Tiegel, Manuel Wiedmer
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Our main result is that PTFs can be learned efficiently in the testable model as well. ... Theorem 3 is the first result achieving efficient testable learning for PTFs of any fixed degree d (up to constant error ε > 0). Previously, such a result was not even available for learning degree-2 PTFs with respect to the Gaussian distribution. |
| Researcher Affiliation | Academia | Lucas Slot, Department of Computer Science, ETH Zurich, lucas.slot@inf.ethz.ch; Stefan Tiegel, Department of Computer Science, ETH Zurich, stefan.tiegel@inf.ethz.ch; Manuel Wiedmer, Department of Computer Science, ETH Zurich, manuel.wiedmer@inf.ethz.ch |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. It describes theoretical concepts and proofs. |
| Open Source Code | No | The paper is theoretical and does not mention releasing any open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and discusses properties of distributions such as the standard Gaussian, but it does not use a public dataset for training or provide access information for one. |
| Dataset Splits | No | The paper is theoretical and does not include empirical experiments with dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not describe specific hardware used for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not list any software dependencies or version numbers. |
| Experiment Setup | No | The paper is theoretical and does not provide specific experimental setup details such as hyperparameter values or training configurations. |
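For context on the object the main result concerns: a polynomial threshold function (PTF) of degree d is the sign of a degree-d polynomial, f(x) = sign(p(x)). The snippet below is a minimal illustrative sketch, not code from the paper (which contains none): it evaluates a hypothetical degree-2 PTF with arbitrary coefficients on samples from the standard Gaussian, the target distribution in the paper's testable-learning setting.

```python
import numpy as np

rng = np.random.default_rng(0)


def degree2_ptf(x, A, b, c):
    """Degree-2 PTF: sign(x^T A x + b^T x + c).

    Illustrative only; A, b, c below are arbitrary choices,
    not parameters from the paper.
    """
    return np.sign(x @ A @ x + b @ x + c)


d = 3  # ambient dimension (hypothetical choice)
A = rng.standard_normal((d, d))
b = rng.standard_normal(d)
c = 0.5

# Labels of the PTF on samples from the standard Gaussian N(0, I_d).
samples = rng.standard_normal((5, d))
labels = [degree2_ptf(x, A, b, c) for x in samples]
print(labels)
```

The sketch only fixes notation for what a PTF computes; efficiently learning such functions in the testable model, where the learner must also certify its distributional assumptions, is the paper's theoretical contribution.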