Meta-Learning-Based Adaptive Stability Certificates for Dynamical Systems

Authors: Amit Jena, Dileep Kalathil, Le Xie

AAAI 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We demonstrate the stability assessment performance of meta-NLFs on some standard benchmark autonomous dynamical systems." ... "We compare the performance of the meta-NLF approach with other baseline methods on various benchmark control systems to demonstrate the efficacy of our method." ... From the Experiments section: "We demonstrate the performance of meta-NLF with other standard stability assessment methods on various closed-loop dynamical systems following definition (1)."
Researcher Affiliation | Academia | Amit Jena, Dileep Kalathil, Le Xie; Department of Electrical and Computer Engineering, Texas A&M University, USA; amit.jena@tamu.edu, dileep.kalathil@tamu.edu, le.xie@tamu.edu
Pseudocode | Yes | Algorithm 1: Meta-NLF Training Algorithm
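Algorithm 1 follows the MAML template: a per-task inner adaptation step on a support batch, then an outer meta-update from the query batch. A minimal first-order sketch on toy linear tasks (the linear model, step sizes, and task distribution below are illustrative stand-ins, not the paper's neural Lyapunov function or its hyperparameters):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 0.01, 0.1     # assumed adaptation (inner) and meta (outer) step sizes
w = np.zeros(3)             # meta-parameters of a toy linear model, standing in for the NLF network
w_star = np.array([1.0, -1.0, 0.5])  # shared structure underlying all tasks (illustrative)

def loss_grad(w, X, y):
    """Gradient of mean squared error for a linear model (a stand-in training loss)."""
    r = X @ w - y
    return X.T @ r / len(y)

for step in range(200):                       # total meta-training steps K
    meta_grad = np.zeros_like(w)
    for _ in range(4):                        # meta-training batch of P tasks
        w_task = w_star + 0.1 * rng.normal(size=3)       # task drawn around shared structure
        X_tr, X_te = rng.normal(size=(8, 3)), rng.normal(size=(8, 3))
        y_tr, y_te = X_tr @ w_task, X_te @ w_task
        w_adapt = w - alpha * loss_grad(w, X_tr, y_tr)   # inner adaptation on the support batch
        meta_grad += loss_grad(w_adapt, X_te, y_te)      # first-order meta-gradient on the query batch
    w -= beta * meta_grad / 4                 # outer (meta) update

# After training, w sits near the structure shared across tasks,
# so a single inner step adapts it well to any new task draw.
```

The first-order approximation (dropping the second-derivative term of the true MAML meta-gradient) keeps the sketch short; the paper's Algorithm 1 may or may not make the same simplification.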
Open Source Code | Yes | "Our codes and appendix are available at https://github.com/amitjena1992/Meta-NLF."
Open Datasets | No | The paper generates its training data by sampling system parameters and states ("For training, we consider multiple tasks by sampling the system parameters ϑ_i ∼ N(ϑ_o, Σ_ϑ). For the i-th task pertaining to parameter ϑ_i, we sample the state x_i,j from the domain D and get y_i,j = f_ϑi(x_i,j) to constitute the data sample z_i,j = (x_i,j, y_i,j)"). It does not refer to, or provide access information for, any publicly available dataset.
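The quoted data-generation procedure can be sketched directly: draw task parameters from a Gaussian around the nominal system, sample states from the domain D, and evaluate the dynamics to form input/output pairs. The nominal parameters below reuse the paper's pendulum values, but the covariance, domain bounds, and the dynamics function are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Nominal parameters theta_o = (l, m, g, b) from the paper's pendulum setup;
# the covariance is an assumption (only the length l treated as stochastic).
theta_o = np.array([0.5, 0.15, 9.81, 0.1])
sigma_theta = np.diag([0.05**2, 0.0, 0.0, 0.0])

def f_theta(x, theta):
    """Stand-in dynamics f_theta(x): a damped pendulum (the paper's
    closed-loop systems may differ)."""
    l, m, g, b = theta
    th, om = x
    return np.array([om, -(g / l) * np.sin(th) - (b / (m * l**2)) * om])

def sample_task_data(n_samples, domain_lo, domain_hi):
    """One task: draw theta_i ~ N(theta_o, Sigma_theta), sample states x_{i,j}
    from the domain D, and form pairs z_{i,j} = (x_{i,j}, f_{theta_i}(x_{i,j}))."""
    theta_i = rng.multivariate_normal(theta_o, sigma_theta)
    xs = rng.uniform(domain_lo, domain_hi, size=(n_samples, 2))
    ys = np.array([f_theta(x, theta_i) for x in xs])
    return xs, ys

xs, ys = sample_task_data(8, [-np.pi, -2.0], [np.pi, 2.0])
print(xs.shape, ys.shape)  # (8, 2) (8, 2)
```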
Dataset Splits | No | The paper does not specify training, validation, or test splits (e.g., percentages or sample counts). It describes mini-batches and samples for adaptation within the MAML framework ("Each mini data-batch (S^tr_i,j, S^te_i,j) comprises of K and J number of samples"), but no predefined dataset splits.
Hardware Specification | No | The paper does not specify the hardware used for its experiments (CPU model, GPU model, or memory). It mentions only "training" and "simulations" without hardware context.
Software Dependencies | No | The paper does not list software dependencies with version numbers (e.g., Python version, deep learning framework versions such as PyTorch or TensorFlow, or solver versions).
Experiment Setup | Yes | Algorithm 1 lists its required inputs: "adaptation and meta step sizes (α, α), meta-training batch size P, total meta-training steps K." The Simulations section provides system-specific parameters: "we assume l to be stochastic and create the nominal and test-time systems by setting ϑ_0 = (l, m, g, b)_0 = (0.5, 0.15, 9.81, 0.1) and ϑ_n+1 = (1.2, 0.15, 9.81, 0.1)."
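The quoted parameters ϑ = (l, m, g, b) match a standard damped-pendulum benchmark, with the nominal and test-time systems differing only in the length l. As a hedged illustration (the exact closed-loop form and parameter ordering are assumptions, not taken from the paper), the two systems can be instantiated as:

```python
import numpy as np

# Parameters from the paper's simulation setup: theta = (l, m, g, b)
# = (length, mass, gravity, damping); only l differs between the two systems.
theta_0   = (0.5, 0.15, 9.81, 0.1)   # nominal system
theta_np1 = (1.2, 0.15, 9.81, 0.1)   # test-time system

def pendulum_dynamics(x, theta, u=0.0):
    """Standard damped pendulum, m*l^2*th'' + b*th' + m*g*l*sin(th) = u,
    written as a first-order system in x = (th, om). The paper's actual
    closed-loop dynamics may include a feedback controller u(x)."""
    l, m, g, b = theta
    th, om = x
    dom = (u - b * om - m * g * l * np.sin(th)) / (m * l**2)
    return np.array([om, dom])

# The origin is an equilibrium of the unforced system for both parameter draws,
# consistent with assessing stability of the equilibrium at x = 0:
print(pendulum_dynamics(np.array([0.0, 0.0]), theta_0))    # [0. 0.]
print(pendulum_dynamics(np.array([0.0, 0.0]), theta_np1))  # [0. 0.]
```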