PGODE: Towards High-quality System Dynamics Modeling
Authors: Xiao Luo, Yiyang Gu, Huiyu Jiang, Hang Zhou, Jinsheng Huang, Wei Ju, Zhiping Xiao, Ming Zhang, Yizhou Sun
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments in both in-distribution and out-of-distribution settings validate the superiority of PGODE compared to various baselines. |
| Researcher Affiliation | Academia | 1Department of Computer Science, University of California, Los Angeles, USA 2National Key Laboratory for Multimedia Information Processing, School of Computer Science, Peking University 3Department of Statistics and Applied Probability, University of California, Santa Barbara, USA 4Department of Statistics, University of California, Davis, USA. |
| Pseudocode | Yes | We summarize the whole algorithm in Appendix A. Algorithm 1 Training Algorithm of PGODE |
| Open Source Code | No | The paper does not provide an explicit statement about releasing its source code or a direct link to a code repository for its methodology. |
| Open Datasets | Yes | The two physical dynamic simulation datasets Springs and Charged are commonly used in the field of machine learning for simulating physical systems. [...] We construct two molecular dynamics datasets using two proteins, i.e., 5AWL, 2N5C, and our approach is evaluated on the two datasets. [...] 5AWL and 2N5C, which can be accessed from the RCSB (https://www.rcsb.org). |
| Dataset Splits | Yes | For the physical dynamic datasets, we generate 1200 samples for training and validating, 200 samples for ID testing and 200 samples for OOD testing. For the molecular dynamic datasets, we construct 200 samples for training, 50 samples for validating, 50 samples for ID testing and 50 samples for testing in OOD settings. |
| Hardware Specification | Yes | All these experiments in this work are performed on a single NVIDIA A40 GPU. |
| Software Dependencies | Yes | We leverage PyTorch (Paszke et al., 2017) and torchdiffeq package (Kidger et al., 2021) to implement all the compared approaches and our PGODE. |
| Experiment Setup | Yes | The number of prototypes is set to 5 as default. For optimization, we utilize an Adam optimizer (Kingma & Ba, 2015) with an initial learning rate of 0.0005. The batch size is set to 256 for the physical dynamic simulation datasets and 64 for the molecular dynamic simulation datasets. |
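For reference, the reported hyperparameters can be gathered into a small configuration sketch. This is not the authors' code; the dictionary keys and helper name below are illustrative assumptions, and only the numeric values come from the paper's stated setup.

```python
# Hyperparameters reported in the PGODE experiment setup (values from the
# paper; structure and names are illustrative, not from the authors' code).
CONFIG = {
    "num_prototypes": 5,          # default number of prototypes
    "optimizer": "Adam",          # Adam (Kingma & Ba, 2015)
    "learning_rate": 5e-4,        # initial learning rate 0.0005
    "batch_size": {
        "physical": 256,          # Springs / Charged simulation datasets
        "molecular": 64,          # 5AWL / 2N5C molecular dynamics datasets
    },
}

def batch_size_for(dataset_family: str) -> int:
    """Return the reported batch size for a dataset family
    ("physical" or "molecular")."""
    return CONFIG["batch_size"][dataset_family]
```

A reproduction attempt would plug these values into the optimizer and data-loader construction; any settings not quoted above (e.g., number of epochs, weight decay) are not specified in this table and would need to be taken from the paper's appendix.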