On Robust Concepts and Small Neural Nets
Authors: Amit Deshpande, Sushrut Karmalkar
ICLR 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We prove that any noise-stable boolean function on n boolean-valued input variables can be well-approximated by a two-layer linear threshold circuit with a small number of hidden-layer nodes and small weights, that depend only on the noise-stability and approximation parameters, and are independent of n. We also give a polynomial time learning algorithm that outputs a small two-layer linear threshold circuit that approximates such a given function. The universal approximation theorem of Hornik et al. (1989) and Cybenko (1992) provides a foundation to the mathematical theory of artificial neural networks. |
| Researcher Affiliation | Collaboration | Amit Deshpande, Microsoft Research, Vigyan, 9 Lavelle Road, Bengaluru 560001, India (amitdesh@microsoft.com); Sushrut Karmalkar, Department of Computer Science, The University of Texas at Austin, 2317 Speedway, Stop D9500, Austin, TX 78712, USA (sushrutk@cs.utexas.edu) |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any concrete access to source code for the methodology described. |
| Open Datasets | No | This is a theoretical paper and does not mention specific datasets used for training or public availability of any dataset. |
| Dataset Splits | No | This is a theoretical paper and does not provide specific dataset split information. |
| Hardware Specification | No | The paper does not provide any specific hardware details used for running experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not include specific experimental setup details, hyperparameters, or training configurations. |
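The paper's central object, a two-layer linear threshold circuit, can be sketched in a few lines of code. The following is a hypothetical illustration (not the paper's construction or learning algorithm): each gate outputs 1 when a weighted sum of its inputs meets a threshold, and the circuit composes one hidden layer of such gates with a single output gate. Majority, used here as the example target, is a classic noise-stable boolean function.

```python
import numpy as np

def ltf(weights, threshold, x):
    """Linear threshold gate: 1 if <weights, x> >= threshold, else 0."""
    return int(np.dot(weights, x) >= threshold)

def two_layer_circuit(hidden, out_w, out_t, x):
    """Evaluate a two-layer linear threshold circuit.

    hidden: list of (weights, threshold) pairs, one per hidden gate.
    out_w, out_t: weights and threshold of the output gate,
    applied to the vector of hidden-gate outputs.
    """
    h = np.array([ltf(w, t, x) for (w, t) in hidden])
    return ltf(out_w, out_t, h)

# Illustrative example: majority on 5 bits, represented trivially with
# one hidden gate (fires when at least 3 inputs are 1) and a
# pass-through output gate. All weights are small integers.
n = 5
hidden = [(np.ones(n), 3)]
out_w, out_t = np.array([1]), 1

x = np.array([1, 0, 1, 1, 0])
print(two_layer_circuit(hidden, out_w, out_t, x))  # → 1
```

The theorem's point is that for any noise-stable function, the number of hidden gates and the weight magnitudes can be bounded independently of n; this sketch only shows the circuit model being evaluated, not how such a small circuit is found.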