Parametric Complexity Bounds for Approximating PDEs with Neural Networks
Authors: Tanya Marwah, Zachary Lipton, Andrej Risteski
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We prove that when a PDE's coefficients are representable by small neural networks, the parameters required to approximate its solution scale polynomially with the input dimension d and proportionally to the parameter counts of the coefficient networks. To this end, we develop a proof technique that simulates gradient descent (in an appropriate Hilbert space) by growing a neural network architecture whose iterates each participate as subnetworks in their (slightly larger) successors, and converge to the solution of the PDE. |
| Researcher Affiliation | Academia | Tanya Marwah, Zachary C. Lipton, Andrej Risteski Machine Learning Department, Carnegie Mellon University {tmarwah, zlipton, aristesk}@andrew.cmu.edu |
| Pseudocode | No | The paper describes a proof technique and theoretical algorithms (like simulating gradient descent) but does not contain a formal pseudocode block or algorithm listing. |
| Open Source Code | No | The paper is theoretical and does not mention releasing any source code. In the NeurIPS 2021 checklist, section '3. If you ran experiments...', point (a) 'Did you include the code...' is marked '[N/A]'. |
| Open Datasets | No | The paper is theoretical and does not involve empirical training on datasets. In the NeurIPS 2021 checklist, section '4. If you are using existing assets...', point (a) 'If your work uses existing assets, did you cite the creators?' is marked '[N/A]'. |
| Dataset Splits | No | The paper is theoretical and does not involve empirical evaluation or dataset splits. In the NeurIPS 2021 checklist, section '3. If you ran experiments...', point (b) 'Did you specify all the training details...' is marked '[N/A]'. |
| Hardware Specification | No | The paper is theoretical and does not describe running experiments; therefore, no hardware specifications are mentioned. In the NeurIPS 2021 checklist, section '3. If you ran experiments...', point (d) 'Did you include the total amount of compute and the type of resources used...' is marked '[N/A]'. |
| Software Dependencies | No | The paper is theoretical and does not describe running experiments; therefore, no software dependencies with version numbers are listed. |
| Experiment Setup | No | The paper is theoretical and does not describe running experiments or specific hyperparameters. In the NeurIPS 2021 checklist, section '3. If you ran experiments...', point (b) 'Did you specify all the training details...' is marked '[N/A]'. |
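The proof technique summarized in the Research Type row (simulating gradient descent in function space by growing an architecture whose iterates are subnetworks of their successors) can be illustrated with a toy sketch. This is not the paper's construction: it uses a trivial "PDE" L u = f with L the identity operator, so the fixed point is u* = f, and represents each iterate as a closure that wraps the previous iterate as a subnetwork; all names here are hypothetical.

```python
import numpy as np

# Toy illustration (assumption, not the paper's construction): gradient
# descent in function space for L u = f with L = identity, so u* = f.
# Each iterate u_{k+1} = u_k - eta * (L u_k - f) is built by wrapping
# the previous iterate as a "subnetwork", so the representation grows
# by a constant amount per step -- mirroring the architecture-growing
# argument sketched in the abstract.

def make_next_iterate(u_k, f, eta):
    """Return u_{k+1}, a closure containing u_k as a subnetwork."""
    def u_next(x):
        v = u_k(x)                  # evaluate the embedded subnetwork once
        return v - eta * (v - f(x))  # residual (gradient-descent) update
    return u_next

f = np.sin                           # target function playing the role of f
u = lambda x: np.zeros_like(x)       # initial iterate u_0 = 0
eta = 0.5                            # step size

for _ in range(20):                  # grow the "network" for 20 GD steps
    u = make_next_iterate(u, f, eta)

xs = np.linspace(0.0, np.pi, 100)
err = np.max(np.abs(u(xs) - f(xs)))  # sup-norm error, contracts as (1-eta)^k
print(err)
```

Because the update is a contraction with factor (1 - eta), the error after 20 steps is at most 0.5**20 of the initial error, while the nested-closure representation grows only linearly in the number of steps.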