Nonparametric Boundary Geometry in Physics-Informed Deep Learning

Authors: Scott Cameron, Arnu Pretorius, Stephen Roberts

NeurIPS 2023

Reproducibility Variable Result LLM Response
Research Type: Experimental. "For boundary geometry training data, we use a collection of meshes created in Fusion 360 [21]. After preprocessing and filtering, our data set contains about 12 thousand triangular meshes. For validation data, we use an FEM solver on a small collection of meshes, which we treat as our ground truth. While FEM solvers are only approximate methods, in the cases we consider they are extremely accurate and so their errors are negligible. We measure the mean squared error (MSE) of our neural operator approximation compared to the FEM solution and report these errors as percentages of the mean square value (the average intensity) of the FEM solution."
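The quoted error metric (MSE as a percentage of the mean square value of the FEM reference) can be sketched in a few lines. This is a minimal illustration; the function name and pure-Python style are mine, not taken from the paper:

```python
def relative_mse_percent(pred, ref):
    """MSE of a prediction against an FEM reference solution,
    reported as a percentage of the reference's mean square value
    (its average intensity), as described in the paper's setup."""
    n = len(ref)
    mse = sum((p - r) ** 2 for p, r in zip(pred, ref)) / n
    mean_sq = sum(r ** 2 for r in ref) / n
    return 100.0 * mse / mean_sq
```

For example, a prediction off by 0.1 everywhere against a reference of [1.0, 2.0] gives an MSE of 0.01 over a mean square value of 2.5, i.e. a 0.4% relative error.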
Researcher Affiliation: Collaboration. Scott Cameron (Oxford University and Instadeep Ltd., United Kingdom, scameron@instadeep.com); Arnu Pretorius (Instadeep Ltd., South Africa); Stephen Roberts (Oxford University, United Kingdom).
Pseudocode: No. The paper describes the architecture and training process in prose but does not provide structured pseudocode or algorithm blocks.
Open Source Code: No. The paper does not provide an explicit statement or link for the open-sourcing of the code for the described methodology.
Open Datasets: Yes. "For boundary geometry training data, we use a collection of meshes created in Fusion 360 [21]. After preprocessing and filtering, our data set contains about 12 thousand triangular meshes. ... Footnote 6: Available at https://github.com/AutodeskAILab/Fusion360GalleryDataset."
Dataset Splits: No. The paper mentions using 'validation data' and a 'validation set' but does not provide specific split percentages, sample counts, or a detailed methodology for creating these splits.
Hardware Specification: Yes. "All experiments use the same network architecture and hyper-parameters. They were trained on an RTX 3070 with 8 GB of VRAM."
Software Dependencies: No. The paper mentions various software components and optimizers (e.g., Fusion 360, MeshCNN, Transformer, Adam, RAdam, RMSProp) but does not provide specific version numbers for these dependencies.
Experiment Setup: Yes. "We use stochastic gradient descent with momentum as we found other optimizers (Adam [5], RAdam [8] and RMSProp) to be very unstable. We use a cosine annealing learning rate schedule with a linear warmup to a maximum learning rate of 0.001. To improve stability we use gradient clipping and gradient accumulation to increase the effective batch size."
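The learning-rate schedule quoted above (linear warmup to a maximum of 0.001, followed by cosine annealing) can be sketched as a standalone function. This is an assumption-laden illustration: the paper does not specify warmup length, total steps, or the annealing floor, so those are free parameters here and the final rate is assumed to decay to zero:

```python
import math

def lr_schedule(step, total_steps, warmup_steps, max_lr=1e-3):
    """Linear warmup to max_lr over warmup_steps, then cosine
    annealing from max_lr down to zero over the remaining steps.
    Matches the schedule described in the experiment setup, with
    warmup_steps and total_steps as assumed hyper-parameters."""
    if step < warmup_steps:
        # Linear ramp: (step + 1) / warmup_steps reaches 1 at the last warmup step.
        return max_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * max_lr * (1.0 + math.cos(math.pi * progress))
```

In a framework such as PyTorch the same shape is typically obtained by chaining a linear-warmup scheduler with `CosineAnnealingLR`; the explicit function above just makes the two phases easy to inspect.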