Neural Network Approximations of PDEs Beyond Linearity: A Representational Perspective
Authors: Tanya Marwah, Zachary Chase Lipton, Jianfeng Lu, Andrej Risteski
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | "However, most prior theoretical analyses have been limited to linear PDEs. In this work, we take a step towards studying the representational power of neural networks for approximating solutions to nonlinear PDEs. Our proof technique involves neurally simulating (preconditioned) gradient descent in an appropriate Hilbert space." |
| Researcher Affiliation | Academia | Carnegie Mellon University; Duke University. |
| Pseudocode | No | The paper describes mathematical algorithms and iterative processes but does not include a clearly labeled pseudocode or algorithm block. |
| Open Source Code | No | The paper is theoretical and does not mention the release of source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and does not describe experiments using datasets. |
| Dataset Splits | No | The paper is theoretical and does not describe experiments or dataset splits for training or validation. |
| Hardware Specification | No | The paper is theoretical and does not mention any hardware specifications for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe an experimental setup or hyperparameters. |