Parameter Inference with Bifurcation Diagrams

Authors: Gregory Szep, Neil Dalchau, Attila Csikász-Nagy

NeurIPS 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In this section, we apply the method first to minimal examples that can produce saddle-node and pitchfork bifurcations (both N = 1, M = 2), and then a slightly more complex model (N = 2, M = 5) that has multiple parametric regimes producing saddle-node bifurcations. We also demonstrate our method on a model of greater complexity, to convince the reader that the method can be used on more realistic examples with practical significance. In Supplementary D we demonstrate the identification of saddle-node bifurcations and damped oscillations in a model (N = 4, M = 21) of a synthetic gene circuit in E. coli [3]." (quoted from the Experiments & Results section; a sketch of the two minimal models is given below the table)
Researcher Affiliation | Collaboration | Gregory Szep, King's College London, London, WC2R 2LS, UK, gregory.szep@kcl.ac.uk; Attila Csikász-Nagy, Pázmány Péter Catholic University, Budapest, 1083, Hungary, csikasznagy@gmail.com; Neil Dalchau, Microsoft Research Cambridge, Cambridge, CB1 2FB, UK, ndalchau@gmail.com
Pseudocode | No | The paper describes mathematical formulations and methods in text and equations but does not provide a structured pseudocode or algorithm block.
Open Source Code | Yes | The method is implemented as a Julia package, BifurcationInference.jl (an installation sketch is given below the table).
Open Datasets | No | The paper focuses on parameter inference for differential equations using mathematical models (e.g., "minimal models", the "genetic toggle switch") rather than explicitly using a traditional public dataset with access information. No specific datasets with links, DOIs, or formal citations are provided.
Dataset Splits | No | The paper does not provide specific dataset split information (e.g., percentages, sample counts, or citations to predefined splits) for training, validation, or testing.
Hardware Specification | Yes | Calculations were performed on an Intel Core i7-6700HQ CPU @ 2.60GHz x 8 without GPU acceleration.
Software Dependencies | No | The paper mentions software such as Flux.jl and Optim.jl but does not specify their version numbers. The Julia version is also not stated, nor are specific versions for any other libraries or solvers.
Experiment Setup | Yes | "Optimisations of two parameters (θ₁, θ₂) using simple gradient descent from Flux.jl with learning rate η = 0.01 for the minimal saddle-node and pitchfork models (Figure 1)... optimisation using the ADAM optimiser [32] from Flux.jl with learning rate η = 0.1 converged to one of two clusters... optimisation was restricted to the positive parameter regime by transforming the parameters to log-space θ ↦ 10^θ. At the beginning of each optimisation run an initial θ was chosen in the log-space by sampling from a multivariate normal distribution with mean zero and standard deviation one." (a sketch of this optimisation loop is given below the table)
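The minimal examples quoted in the Research Type row are one-dimensional ODE models (N = 1) with two inferred parameters (M = 2). Below is a minimal Julia sketch of what such models can look like, assuming standard cubic normal forms for the two bifurcation types; the exact right-hand sides used in the paper may differ.

```julia
# Hedged sketch: cubic normal forms for the two minimal models
# (one state u, two inferred parameters θ = (θ₁, θ₂)).
# p is the control parameter swept along the bifurcation diagram;
# these are standard normal forms, not necessarily the paper's exact equations.

# saddle-node type: pairs of equilibria collide and vanish as p varies
F_saddle(u, p, θ) = p + θ[1]*u + θ[2]*u^3

# pitchfork type: a symmetric pair of branches emerges from u = 0
F_pitchfork(u, p, θ) = (p + θ[1])*u + θ[2]*u^3
```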
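For the Open Source Code row, a brief installation sketch follows. The repository URL is an assumption (it is not given in this excerpt), so adjust it to wherever BifurcationInference.jl is actually hosted.

```julia
# Assumed repository location; not stated in the excerpt above.
using Pkg
Pkg.add(url="https://github.com/gszep/BifurcationInference.jl")
using BifurcationInference
```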
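The Experiment Setup row describes simple gradient descent (η = 0.01) and ADAM (η = 0.1) from Flux.jl, a log-space reparameterisation θ ↦ 10^θ, and standard-normal initialisation. Below is a minimal sketch of that loop, with a placeholder loss(θ) standing in for the paper's bifurcation-matching cost (hypothetical; the real objective compares predicted and target bifurcation diagrams).

```julia
using Flux

# Placeholder objective (hypothetical): the paper's actual cost measures
# how well the model's bifurcation diagram matches the target bifurcations.
loss(θ) = sum(abs2, 10 .^ θ)

θ = randn(2)            # initial guess: each θᵢ ~ Normal(0, 1), in log-space
opt = Descent(0.01)     # simple gradient descent, η = 0.01 (minimal models)
# opt = ADAM(0.1)       # ADAM with η = 0.1 (larger model)

for iteration in 1:1000
    grads = gradient(loss, θ)[1]           # reverse-mode gradient of the loss
    Flux.Optimise.update!(opt, θ, grads)   # in-place update; model only ever sees 10 .^ θ > 0
end
```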