Towards LLM4QPE: Unsupervised Pretraining of Quantum Property Estimation and A Benchmark
Authors: Yehui Tang, Hao Xiong, Nianzu Yang, Tailong Xiao, Junchi Yan
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments show the promising efficacy of LLM4QPE in various tasks including classifying quantum phases of matter on Rydberg atom model and predicting two-body correlation function on anisotropic Heisenberg model. |
| Researcher Affiliation | Academia | Yehui Tang^1, Hao Xiong^1, Nianzu Yang^1, Tailong Xiao^{2,3}, Junchi Yan^1. 1: Department of Computer Science and Engineering, Shanghai Jiao Tong University; 2: Institute of Quantum Sensing and Information Processing, Shanghai Jiao Tong University; 3: Hefei National Laboratory, Hefei, China |
| Pseudocode | No | The paper describes the model architecture and processes (e.g., pretraining and finetuning steps) using figures and explanatory text, but it does not include any formally structured pseudocode blocks or algorithm listings labeled as such. |
| Open Source Code | No | The code to train the model and analyze the experimental results is available from the first author on reasonable request. |
| Open Datasets | Yes | The generated quantum data of the Rydberg atom model and the anisotropic Heisenberg model is available at https://github.com/abel1231/qpe-data. |
| Dataset Splits | No | The paper states, 'Then we split Df to construct train/test dataset Dt/De.' and 'We further split the Df into Dt and De for training and evaluation respectively with varied separation ratio.' It specifies train and test sets but does not provide explicit details for a validation split (e.g., percentages, sample counts, or methodology for the split). |
| Hardware Specification | No | The paper mentions 'given limited measurements on a resource-limited device' but does not provide specific details about the hardware used for the experiments, such as GPU/CPU models, memory, or specific computing environments. |
| Software Dependencies | No | The paper mentions software tools like 'Bloqade.jl (blo, 2023)', 'scipy (Virtanen et al., 2020)', and 'pennylane (Bergholm et al., 2018) toolbox'. However, it does not provide specific version numbers for these or other software dependencies used in the experiments, which are necessary for reproducible descriptions. |
| Experiment Setup | Yes | For both the Rydberg atom model and the anisotropic Heisenberg model, we fix Np = 100 and Kp = 1024. For each training iteration, we randomly sample Bp rows of Ein and Cin, so that the input of the model is {(σ_b, c_b) | σ_b ∈ Ein, c_b ∈ Cin}_{b=1}^{Bp} with batch size Bp. The pretrained parameters are transferred to finetune the model using Dt, where the number of sampled physical conditions Nt ∈ {25, 64, 100} and the number of measurement strings Kf ∈ {64, 128, 256, 512, 1024}. |
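
The batch construction quoted in the Experiment Setup row can be sketched as follows. This is a minimal illustration only, not the paper's code: it assumes the measurement outcomes `E_in` and physical conditions `C_in` are row-aligned NumPy arrays, and all names (`sample_pretrain_batch`, the toy shapes) are hypothetical.

```python
import numpy as np

def sample_pretrain_batch(E_in, C_in, B_p, rng=None):
    """Randomly sample B_p aligned rows from E_in (measurement strings)
    and C_in (physical conditions) to form one pretraining batch,
    i.e. {(sigma_b, c_b)}_{b=1}^{B_p}."""
    rng = rng or np.random.default_rng()
    idx = rng.choice(len(E_in), size=B_p, replace=False)
    return E_in[idx], C_in[idx]

# Toy data: Np = 100 physical conditions, each with Kp = 1024 binary
# measurement outcomes (placeholder values, shapes chosen for illustration).
rng = np.random.default_rng(0)
E_in = rng.integers(0, 2, size=(100, 1024))   # placeholder outcomes
C_in = rng.uniform(0.0, 1.0, size=(100, 3))   # placeholder conditions
sigma_batch, c_batch = sample_pretrain_batch(E_in, C_in, B_p=16, rng=rng)
```

Sampling the same row indices from both arrays keeps each measurement string paired with the physical condition it was generated under, which is what the set notation in the quoted setup describes.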