Learning to Simulate and Design for Structural Engineering

Authors: Kai-Hung Chang, Chin-Yi Cheng

ICML 2020

Reproducibility
Variable | Result | LLM Response
Research Type | Experimental | The performance of the proposed structural designs is comparable to the ones optimized by a genetic algorithm (GA), with all the constraints satisfied. Our results show that Neural Sizer can produce designs comparable to those generated by the genetic algorithm running for 1,000 iterations. We also perform experiments including ablation, extrapolation, and a user study. We split the total 4,000 samples into 3,200, 400, and 400 for training, evaluation, and testing purposes. All training and testing run on a Quadro M6000 GPU.
Researcher Affiliation | Industry | Autodesk Research, San Francisco, California, United States.
Pseudocode | No | The paper describes the network structures and training processes but does not provide any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | No | The paper makes no statement about open-sourcing its code and provides no link to a code repository.
Open Datasets | No | Due to the lack of real structural design data, we synthesize a dataset that contains building skeletons with randomly sampled cross-sections in real-world scale. We also use Autodesk Robot Structural Simulator (RSA), a simulation software widely used in the industry, to compute the structural simulation results for the synthetic dataset. Please refer to the supplementary material for more details. The paper describes the creation of a synthetic dataset but does not provide concrete access information (link, DOI, or explicit public availability statement) for it.
Dataset Splits | Yes | We split the total 4,000 samples into 3,200, 400, and 400 for training, evaluation, and testing purposes. (A hedged split sketch follows this table.)
Hardware Specification | Yes | All training and testing run on a Quadro M6000 GPU.
Software Dependencies | No | The paper mentions using the "Adam Optimizer" and "Autodesk Robot Structural Simulator (RSA)", but does not provide version numbers for any software dependencies.
Experiment Setup | Yes | The Adam optimizer is used with learning rate 1e-4 and weight decay 5e-4. Batch size is set to 1 and the number of epochs is 5. Neural Sizer updates based on the back-propagation gradients once every 5 epochs, and runs for 50,000 epochs during training. (A hedged training-loop sketch follows this table.)
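
The 3,200/400/400 partition noted above is the only detail the paper gives about data splitting. Below is a minimal sketch of one way to reproduce a split with those counts; the helper name split_dataset, the shuffling step, and the fixed seed are assumptions, since neither the dataset nor the splitting code is released.

    import random

    def split_dataset(samples, seed=0):
        # Shuffle, then carve off 3,200 train / 400 eval / 400 test,
        # matching the counts reported for the 4,000 synthetic samples.
        # Shuffling and the seed are assumptions, not from the paper.
        rng = random.Random(seed)
        indices = list(range(len(samples)))
        rng.shuffle(indices)
        train = [samples[i] for i in indices[:3200]]
        evaluation = [samples[i] for i in indices[3200:3600]]
        test = [samples[i] for i in indices[3600:]]
        return train, evaluation, test

Whether the authors shuffled before splitting, and with what seed, is not stated; the fixed seed here only makes the sketch deterministic.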
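
The hyperparameters in the Experiment Setup row map onto a standard PyTorch training loop. The sketch below illustrates that mapping under stated assumptions: the two-layer model, the random tensors, and the MSE loss are stand-ins (the actual graph-based Neural Sizer and its loss are not public); only the Adam settings, batch size, and epoch count come from the paper.

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset

    # Stand-in network and data; the real Neural Sizer architecture and
    # the synthetic building dataset are not publicly available.
    model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
    data = TensorDataset(torch.randn(8, 16), torch.randn(8, 1))
    loader = DataLoader(data, batch_size=1)  # batch size 1, as reported

    # Reported optimizer: Adam with lr 1e-4 and weight decay 5e-4.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4,
                                 weight_decay=5e-4)

    for epoch in range(5):  # "the number of epochs is 5"
        for x, y in loader:
            optimizer.zero_grad()
            loss = nn.functional.mse_loss(model(x), y)  # placeholder loss
            loss.backward()
            optimizer.step()

The paper's schedule for Neural Sizer (an update once every 5 epochs, over 50,000 epochs) is not reproduced here; accumulating gradients across epochs before calling optimizer.step() would be the straightforward way to mimic it.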