Position: Optimization in SciML Should Employ the Function Space Geometry

Authors: Johannes Müller, Marius Zeinhofer

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | An illustration of the importance of the infinite-dimensional perspective is provided in Figure 1, which shows that respecting the function space geometry can yield improvements of several orders of magnitude. Figure 1 plots training curves for a PINN on a 2-dimensional Poisson equation: the first-order optimizers (Adam and gradient descent) plateau, while the second-order optimizers (ENGD and Newton's method) perform much better, with the function-space-inspired ENGD reaching the highest accuracy by several orders of magnitude (a hypothetical sketch of the underlying PINN loss follows this table).
Researcher Affiliation | Collaboration | (1) Chair of Mathematics of Information Processing, RWTH Aachen University, Aachen, Germany; (2) Simula Research Laboratory, Oslo, Norway.
Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks.
Open Source Code | No | The paper does not provide any concrete access to source code for the methodology described.
Open Datasets | No | The paper states that for PINNs,
Dataset Splits | No | The paper discusses training but does not provide specific dataset split information (e.g., percentages or counts for training, validation, or test sets).
Hardware Specification | No | The paper mentions
Software Dependencies | No | The paper does not provide ancillary software details with version numbers (e.g., library or solver names and versions) needed to replicate the experiments.
Experiment Setup | No | The paper names the optimizers used for Figure 1 (Adam, gradient descent, ENGD, Newton's method) but does not provide hyperparameter values or detailed training configurations (e.g., learning rate, batch size, number of epochs); the second sketch after this table illustrates what such a setup might involve.
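
For concreteness, the following is a minimal, hypothetical sketch (in JAX) of the standard PINN least-squares loss for a 2-dimensional Poisson problem -Δu = f on the unit square, the setting of Figure 1. The network architecture, the right-hand side f, and the collocation-point scheme are all assumptions made for illustration; the paper reports none of these details.

import jax
import jax.numpy as jnp

def init_params(key, widths=(2, 32, 32, 1)):
    # Small tanh MLP; the architecture is an assumption, not the authors' choice.
    keys = jax.random.split(key, len(widths) - 1)
    return [(jax.random.normal(k, (m, n)) / jnp.sqrt(m), jnp.zeros(n))
            for k, (m, n) in zip(keys, zip(widths[:-1], widths[1:]))]

def u(params, x):
    # Network output u(x) for a single point x in R^2.
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return (x @ W + b)[0]

def laplacian(params, x):
    # Δu(x) computed as the trace of the Hessian of u at x.
    return jnp.trace(jax.hessian(lambda y: u(params, y))(x))

def f(x):
    # Assumed manufactured right-hand side; the paper does not specify f.
    return 2.0 * jnp.pi**2 * jnp.sin(jnp.pi * x[0]) * jnp.sin(jnp.pi * x[1])

def pinn_loss(params, interior, boundary):
    # Mean squared PDE residual plus a zero-Dirichlet boundary penalty.
    pde = jax.vmap(lambda x: -laplacian(params, x) - f(x))(interior)
    bc = jax.vmap(lambda x: u(params, x))(boundary)
    return jnp.mean(pde**2) + jnp.mean(bc**2)

With this loss in place, Adam or gradient descent would simply follow the parameter-space gradient of pinn_loss, which is the first-order baseline that plateaus in Figure 1.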
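The function-space-inspired optimizer compared in Figure 1 is ENGD (energy natural gradient descent). Below is a hedged sketch of a damped Gauss-Newton-type update of that flavor, reusing the definitions from the sketch above; the learning rate, damping, and the exact form of the preconditioner are assumptions for illustration, not the authors' settings.

from jax.flatten_util import ravel_pytree

def engd_step(params, interior, boundary, lr=1e-1, damping=1e-8):
    # Flatten the parameter pytree so the Gram matrix can be built explicitly.
    theta, unravel = ravel_pytree(params)

    def residuals(t):
        p = unravel(t)
        pde = jax.vmap(lambda x: -laplacian(p, x) - f(x))(interior)
        bc = jax.vmap(lambda x: u(p, x))(boundary)
        return jnp.concatenate([pde, bc])

    r = residuals(theta)
    J = jax.jacobian(residuals)(theta)            # residual Jacobian in parameter space
    G = J.T @ J + damping * jnp.eye(theta.size)   # Gauss-Newton / Gram matrix
    direction = jnp.linalg.solve(G, J.T @ r)      # preconditioned gradient direction
    return unravel(theta - lr * direction)

Preconditioning the gradient with the Gram matrix of the residual Jacobian is what makes the update mimic a gradient step in function space rather than in parameter space, which is the mechanism the paper credits for the orders-of-magnitude gap in Figure 1.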