Diffusion Models Beat GANs on Topology Optimization

Authors: François Mazé, Faez Ahmed

AAAI 2023

Reproducibility assessment (Variable: Result — supporting LLM response):
Research Type: Experimental — "Compared to a state-of-the-art conditional GAN, our approach reduces the average error on physical performance by a factor of eight and produces eleven times fewer infeasible samples. Our work demonstrates the potential of using diffusion models in topology optimization and suggests a general framework for solving engineering optimization problems using external performance with constraint-aware guidance." (Section 4, Empirical Evaluation: "We created three datasets to train the proposed models, which are made publicly available.")
Researcher Affiliation: Academia — François Mazé¹, Faez Ahmed¹; ¹Massachusetts Institute of Technology; francois.maze@etu.minesparis.psl.eu, {fmaze,faez}@mit.edu
Pseudocode: Yes — Algorithm 1: Regressor guidance for TO, given a conditional diffusion model (µθ(xt | xt+1, v, f, l), Σθ(xt | xt+1, v, f, l)) and a regressor cϕ(xt, v, f, l, bc). Algorithm 2: Guidance strategy for TO using a conditional diffusion model.
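As a rough illustration of what a regressor-guided sampling step of this kind typically looks like, here is a minimal sketch. The function name, the diagonal-variance form, and the sign convention (lower regressor output, e.g. compliance, is better) are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def guided_mean(mu, sigma2, grad_c, lambda_c):
    """One regressor-guidance step (sketch): shift the diffusion model's
    predicted mean mu along the regressor gradient grad_c, scaled by the
    (diagonal) variance sigma2 and a guidance weight lambda_c.

    Assumed convention: lower regressor output is better, so the
    sampler steps against the gradient.
    """
    return mu - lambda_c * sigma2 * grad_c

# Illustrative values only:
mu = np.zeros(4)                            # model's predicted mean for x_t
sigma2 = np.ones(4)                         # model's predicted variance
grad_c = np.array([0.5, -0.5, 1.0, 0.0])    # regressor gradient w.r.t. x_t
x_next_mean = guided_mean(mu, sigma2, grad_c, lambda_c=2.0)
```

The guided mean is then used as the center of the Gaussian from which the next latent is sampled; the guidance weight plays the role of the gradient-scale hyperparameter discussed in the paper.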
Open Source Code: Yes — "We provide access to our data, code, and trained models at the following link: https://decode.mit.edu/projects/topodiff/."
Open Datasets: Yes — "We created three datasets to train the proposed models, which are made publicly available. We provide access to our data, code, and trained models at the following link: https://decode.mit.edu/projects/topodiff/."
Dataset Splits: Yes — "The main dataset is divided into training, validation, and testing as follows: ... 2. The validation data consist of 200 new combinations of constraints containing the same 42 boundary conditions;"
Hardware Specification: No — The paper does not provide specific details about the hardware (e.g., CPU or GPU models, or cloud instance types) used to run the experiments.
Software Dependencies: No — The paper mentions software such as the "SIMP-based TO library ToPy (Hunter et al. 2017)" and "SolidsPy: 2D-Finite Element Analysis with Python (Guarín-Zapata and Gómez 2020)" but does not provide version numbers for any software dependencies.
Experiment Setup: No — The paper mentions hyperparameter tuning, discusses the gradient-scale hyperparameters (λc and λfm) in Sections 3.4 and 4.3, and notes that a grid search was used for tuning, but it does not report the numerical values of these hyperparameters or other training settings such as learning rate or batch size.
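The grid search mentioned here can be sketched as follows. The candidate values and the scoring function are hypothetical placeholders, since the paper does not report them:

```python
import itertools

def grid_search(score_fn, grid):
    """Exhaustive grid search: return the parameter combination with the
    lowest score (e.g. average error on a validation set)."""
    best_params, best_score = None, float("inf")
    for combo in itertools.product(*grid.values()):
        params = dict(zip(grid.keys(), combo))
        score = score_fn(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params

# Hypothetical grid for the two gradient-scale hyperparameters;
# the actual values searched are not reported in the paper.
grid = {"lambda_c": [1.0, 2.0, 4.0], "lambda_fm": [0.5, 1.0, 2.0]}
```

In practice `score_fn` would run guided sampling with the candidate scales on the validation constraints and return a metric such as the average compliance error.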