ChiPFormer: Transferable Chip Placement via Offline Decision Transformer

Authors: Yao Lai, Jinxin Liu, Zhentao Tang, Bin Wang, Jianye Hao, Ping Luo

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "extensive experiments on 32 chip circuits demonstrate that ChiPFormer achieves significantly better placement quality while reducing the runtime by 10x compared to recent state-of-the-art approaches in both public benchmarks and realistic industrial tasks."
Researcher Affiliation | Collaboration | "1 Department of Computer Science, The University of Hong Kong, Hong Kong; 2 Shanghai AI Laboratory, China; 3 Huawei Noah's Ark Lab, China; 4 Zhejiang University, China; 5 Tianjin University, China"
Pseudocode | Yes | "The corresponding pseudo-code is shown in Appendix Algo. 1."
Open Source Code | Yes | "The deliverables are released at sites.google.com/view/chipformer/home."
Open Datasets | Yes | "To facilitate future research, we have released our collected offline dataset. ... The dataset is shared on Google drive."
Dataset Splits | No | The paper discusses training on offline data and fine-tuning on unseen circuits, but does not specify conventional train/validation/test splits (e.g., percentages or sample counts) for a single dataset.
Hardware Specification | Yes | "Table 11: Hyper-parameters used in our experiments ... computing hardware: CPU AMD Ryzen 9 5950X, GPU 2x RTX 3090"
Software Dependencies | No | The paper mentions software such as DREAMPlace, MaskPlace, and GPT, but does not provide specific version numbers for these or other ancillary software components.
Experiment Setup | Yes | "Detailed model architecture and hyper-parameter settings are in Appendix A.5, Table 10 and 11. ... Table 11: Hyper-parameters used in our experiments"