A closer look at the approximation capabilities of neural networks
Authors: Kai Fong Ernest Chong
ICLR 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we give a direct algebraic proof of the theorem. Furthermore, we shall explicitly quantify the number of hidden units required for approximation. |
| Researcher Affiliation | Academia | Kai Fong Ernest Chong, Information Systems Technology and Design (ISTD) pillar, Singapore University of Technology and Design, Singapore. ernest_chong@sutd.edu.sg |
| Pseudocode | No | The paper is theoretical and does not present any algorithms or procedures in pseudocode format. |
| Open Source Code | No | The paper is theoretical and does not present any new software or code for release. |
| Open Datasets | No | The paper is theoretical and does not involve training models on datasets. |
| Dataset Splits | No | The paper is theoretical and does not involve dataset splits for validation. |
| Hardware Specification | No | The paper is theoretical and does not involve computational experiments requiring specific hardware. |
| Software Dependencies | No | The paper is theoretical and does not involve computational experiments requiring specific software dependencies. |
| Experiment Setup | No | The paper is theoretical and does not involve an experimental setup. |
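Although the paper itself is purely theoretical, the claim it quantifies — that a one-hidden-layer network's approximation error shrinks as the number of hidden units grows — is easy to observe empirically. The sketch below is a hypothetical illustration, not the paper's algebraic construction: it fits the output weights of a random-feature ReLU network to f(x) = sin(x) by least squares and reports the sup-norm error on a grid for increasing hidden-layer widths.

```python
import numpy as np

# Hypothetical illustration (not the paper's construction): approximate
# f(x) = sin(x) on [-pi, pi] with a one-hidden-layer ReLU network whose
# inner weights are random and whose output weights are fit by least
# squares, and watch the error shrink as the width n grows.

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 400)
f = np.sin(x)

def fit_error(n):
    """Sup-norm error on the grid for a width-n random-feature ReLU net."""
    w = rng.normal(size=n)                       # random inner weights
    b = rng.uniform(-np.pi, np.pi, size=n)       # random biases
    H = np.maximum(0.0, np.outer(x, w) + b)      # (400, n) hidden activations
    c, *_ = np.linalg.lstsq(H, f, rcond=None)    # least-squares output weights
    return np.max(np.abs(H @ c - f))

for n in (4, 16, 64):
    print(f"n = {n:3d}  sup-norm error = {fit_error(n):.4f}")
```

This is a crude random-feature baseline; the paper's point is a constructive bound on how many units suffice, which this experiment does not reproduce.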