Structural Inference of Dynamical Systems with Conjoined State Space Models

Authors: Aoran Wang, Jun Pang

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive evaluations on sixteen diverse datasets demonstrate that SICSM outperforms existing methods, particularly in scenarios characterized by irregular sampling and incomplete observations, which highlight its potential as a reliable tool for scientific discovery and system diagnostics in disciplines that demand precise modeling of complex interactions.
Researcher Affiliation | Academia | Aoran Wang¹ & Jun Pang¹,²; ¹ Faculty of Science, Technology and Medicine, University of Luxembourg; ² Institute for Advanced Studies, University of Luxembourg; {aoran.wang, jun.pang}@uni.lu
Pseudocode | Yes | The general training pipeline of SICSM is presented in Algorithm 1.
Open Source Code | Yes | Implementation can be found at: https://github.com/wang422003/SICSM-JAX/.
Open Datasets | Yes | Our study first evaluates the SICSM model on two established structural inference datasets: the Spring Simulations dataset [31], which simulates dynamic interactions of balls connected by springs within a symmetric setting, and the NetSim dataset [47]... Additionally, we examined six directed synthetic biological networks (Linear, Linear Long, Cycle, Bifurcating, Trifurcating, and Bifurcating Converging) as outlined in [44]... We also incorporated data from the StructInfer benchmark [3]...
Dataset Splits | Yes | We collect the trajectories and randomly group them into three sets for training, validation and testing with the ratio of 8:2:2, respectively. (A split sketch following this ratio is given after the table.)
Hardware Specification | Yes | All experiments were conducted on a single NVIDIA Ampere 40GB HBM graphics card, paired with 2 AMD Rome CPUs (32 cores @ 2.35 GHz).
Software Dependencies | No | SICSM is implemented with JAX [9], including the following packages: dm-haiku [24] and jraph [19]. The implementation of Residual Blocks follows the script of mamba-minimal-jax (https://github.com/radarFudan/mamba-minimal-jax/tree/main), and the implementation of the GFN of SICSM follows the implementation of JSP-GFN [18]... SICSM is trained with the Adam [29] optimizer, with a learning rate of 0.00001, for 1000 epochs. (No package versions are stated; a dependency-version check is sketched after the table.)
Experiment Setup | Yes | SICSM is trained with the Adam [29] optimizer, with a learning rate of 0.00001, for 1000 epochs. The exact number of layers depends on the number of nodes in the graph, and we report the values of L in Table 2. (A training-loop sketch using these hyperparameters follows below.)
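
As a concrete reading of the 8:2:2 split quoted under Dataset Splits, the sketch below groups whole trajectories at random into training, validation, and test sets. The function name, the NumPy-based shuffling, and the seed handling are illustrative assumptions; the paper does not say how the grouping is implemented.

```python
import numpy as np

def split_trajectories(trajectories, seed=0):
    """Randomly group trajectories into train/val/test with an 8:2:2 ratio.

    `trajectories` is any sequence of per-trajectory arrays; the split is
    over whole trajectories, not individual time steps.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(trajectories))
    n = len(trajectories)
    n_train = round(n * 8 / 12)  # 8:2:2 -> 2/3, 1/6, 1/6 of the trajectories
    n_val = round(n * 2 / 12)
    train = [trajectories[i] for i in idx[:n_train]]
    val = [trajectories[i] for i in idx[n_train:n_train + n_val]]
    test = [trajectories[i] for i in idx[n_train + n_val:]]
    return train, val, test
```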
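The Software Dependencies row names JAX [9], dm-haiku [24], and jraph [19] but gives no version numbers, which is presumably why the result is marked No. A minimal check such as the following (not from the paper) records the versions actually installed when attempting a reproduction.

```python
from importlib.metadata import version

# Packages named in the paper; versions are not stated there, so log the
# locally installed ones explicitly for any reproduction attempt.
for pkg in ("jax", "dm-haiku", "jraph"):
    print(f"{pkg}=={version(pkg)}")
```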
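The Experiment Setup row fixes only the optimizer (Adam), the learning rate (0.00001), and the number of epochs (1000). A generic JAX training loop using those values could look like the sketch below; the use of optax, the `loss_fn` placeholder, and the batching scheme are assumptions, and the SICSM objective itself is not reproduced here.

```python
import jax
import optax  # assumption: the paper names Adam [29] but not the optimiser library

LEARNING_RATE = 1e-5  # 0.00001, as stated in the paper
NUM_EPOCHS = 1000     # as stated in the paper

optimizer = optax.adam(LEARNING_RATE)

def fit(params, loss_fn, batches):
    """Run a plain Adam training loop.

    `loss_fn(params, batch)` stands in for the SICSM objective, which is not
    reproduced here; `batches` is any re-iterable of training batches.
    """
    opt_state = optimizer.init(params)

    @jax.jit
    def step(params, opt_state, batch):
        loss, grads = jax.value_and_grad(loss_fn)(params, batch)
        updates, opt_state = optimizer.update(grads, opt_state, params)
        params = optax.apply_updates(params, updates)
        return params, opt_state, loss

    for _ in range(NUM_EPOCHS):
        for batch in batches:
            params, opt_state, loss = step(params, opt_state, batch)
    return params
```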