DevFormer: A Symmetric Transformer for Context-Aware Device Placement

Authors: Haeyeon Kim, Minsu Kim, Federico Berto, Joungho Kim, Jinkyoo Park

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In this paper, we present DEVFORMER, a novel transformer-based architecture for addressing the complex and computationally demanding problem of hardware design optimization. We apply DEVFORMER to the problem of decoupling capacitor placement and show that it outperforms state-of-the-art methods in both simulated and real hardware, leading to improved performances while reducing the number of components by more than 30%. Finally, we show that our approach achieves promising results in other offline contextual learning-based combinatorial optimization tasks."
Researcher Affiliation | Academia | 1 Department of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST); 2 Department of Industrial and Systems Engineering, Korea Advanced Institute of Science and Technology (KAIST).
Pseudocode | No | The paper describes the DEVFORMER architecture and its components in text and diagrams (e.g., Figure 3), but it does not include pseudocode or a clearly labeled algorithm block.
Open Source Code | Yes | "As a means of promoting transparency and reproducibility, we make the source codes of our method and the baselines discussed in this paper publicly available online as well as an accompanying interactive demonstration program to facilitate engagement and experimentation." Code: https://github.com/kaist-silab/devformer; demo: https://dppbench.streamlit.app
Open Datasets | Yes | "We use N expert data, collected by a genetic algorithm (GA) with a specific number of simulations M = 100 per each. For the test dataset of performance evaluation, 100 PDN cases are used. See Appendix A.2 and Appendix A.5 for detailed data construction. ... We generated 100 test problems and 100 validation problems for 10x10 PDN and 50 test problems and 50 validation problems for 15x15 PDN."
Dataset Splits | Yes | "We generated 100 test problems and 100 validation problems for 10x10 PDN and 50 test problems and 50 validation problems for 15x15 PDN." (A split-generation sketch follows the table.)
Hardware Specification | Yes | "Simulation time was evaluated using the same PDN model on a machine equipped with a 40-thread Intel Xeon Gold 6226R CPU and 512GB of RAM."
Software Dependencies | No | The paper states that the objective function was implemented as an "approximated modeling of electrical components" in a fast Python simulator, but it does not provide version numbers for Python or any other libraries or software dependencies used in the experiments.
Experiment Setup | Yes | "For DevFormer, we use encoder layers of L = 3 and 128 hidden dimensions of MHA, and 512 hidden dimensions for feed-forward. See Appendix C.1 for a detailed setup of training hyperparameters." (Appendix C.1, Table 3, hyperparameter settings for training the model: learning rate 10^-5, λ 10^32, N 2000, P 4, B 100.) (A configuration sketch follows the table.)
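
To make the reported split sizes concrete, below is a minimal Python sketch of how the evaluation splits could be materialized: 100 test and 100 validation problems for the 10x10 PDN, and 50 of each for the 15x15 PDN. Only the split sizes come from the paper; the sample_pdn_case helper and its fields are hypothetical stand-ins for the authors' actual PDN problem generator.

```python
# Minimal sketch of the reported evaluation splits (not the authors' pipeline).
# sample_pdn_case and its fields are hypothetical placeholders; only the
# split sizes below are taken from the paper.
import random

SPLIT_SIZES = {
    "10x10": {"test": 100, "validation": 100},  # reported in the paper
    "15x15": {"test": 50, "validation": 50},    # reported in the paper
}

def sample_pdn_case(grid_size, rng):
    """Hypothetical placeholder: draw one PDN problem instance for a
    grid_size x grid_size power distribution network."""
    n_ports = grid_size * grid_size
    probing_port = rng.randrange(n_ports)
    # The number of keep-out ports (10) is an arbitrary illustrative choice.
    keepout_ports = rng.sample(
        [p for p in range(n_ports) if p != probing_port], k=10
    )
    return {
        "grid_size": grid_size,
        "probing_port": probing_port,
        "keepout_ports": keepout_ports,
    }

def build_splits(seed=0):
    rng = random.Random(seed)
    splits = {}
    for name, sizes in SPLIT_SIZES.items():
        grid_size = int(name.split("x")[0])
        splits[name] = {
            split: [sample_pdn_case(grid_size, rng) for _ in range(count)]
            for split, count in sizes.items()
        }
    return splits

if __name__ == "__main__":
    splits = build_splits()
    for name, parts in splits.items():
        print(name, {split: len(cases) for split, cases in parts.items()})
```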
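
The encoder hyperparameters quoted above (L = 3 encoder layers, 128-dimensional multi-head attention, 512-dimensional feed-forward) map directly onto a standard transformer encoder. The PyTorch sketch below is an illustrative assumption about how such an encoder could be instantiated; the head count and module names are not taken from the official repository, and the reported training hyperparameters are simply echoed as constants.

```python
# Illustrative sketch only: a standard transformer encoder instantiated with
# the hyperparameters reported for DevFormer (L = 3 layers, 128-dim MHA,
# 512-dim feed-forward). The number of heads is an assumption; it is not
# stated in the excerpt above. This is not the authors' implementation.
import torch
import torch.nn as nn

class EncoderSketch(nn.Module):
    def __init__(self, embed_dim=128, ff_dim=512, num_layers=3, num_heads=8):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim,        # 128 hidden dimensions of MHA (reported)
            nhead=num_heads,          # assumed; must divide embed_dim
            dim_feedforward=ff_dim,   # 512 hidden dimensions for feed-forward (reported)
            batch_first=True,
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)  # L = 3 (reported)

    def forward(self, port_features):
        # port_features: (batch, num_ports, embed_dim) embeddings of the
        # candidate decoupling-capacitor ports on the PDN.
        return self.encoder(port_features)

# Training hyperparameters echoed from Appendix C.1, Table 3 as quoted above.
LEARNING_RATE = 1e-5
NUM_EXPERT_DATA = 2000   # N
P = 4                    # P as listed in the table
BATCH_SIZE = 100         # B

if __name__ == "__main__":
    model = EncoderSketch()
    dummy = torch.randn(2, 100, 128)  # 2 instances, 100 ports, 128-dim features
    print(model(dummy).shape)         # torch.Size([2, 100, 128])
```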