Learning Neural Contracting Dynamics: Extended Linearization and Global Guarantees
Authors: Sean Jaffe, Alexander Davydov, Deniz Lapsekili, Ambuj K Singh, Francesco Bullo
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate the performance of ELCD on the high-dimensional LASA, multi-link pendulum, and Rosenbrock datasets. |
| Researcher Affiliation | Academia | Center for Control, Dynamical Systems and Computation, University of California, Santa Barbara; Department of Computer Science, University of California, Santa Barbara. |
| Pseudocode | No | The paper describes the proposed model and its components but does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code for our model and data can be found here: https://github.com/seanjaffe1/Extended-Linearized-Contracting-Dynamics. |
| Open Datasets | Yes | We experiment with the LASA dataset [26]... Alongside the new ELCD model, we also provide the n-link pendulum dataset and Rosenbrock dataset in the supplementary material of the submission. |
| Dataset Splits | No | The paper states that "Each model is trained on all trajectories of the same dimension" but does not provide specific train/validation/test dataset splits (e.g., percentages or counts) for the datasets used. |
| Hardware Specification | No | The paper states only that "All computation is done on CPUs," which is too general: no CPU models, memory sizes, or other hardware details are provided. |
| Software Dependencies | No | The paper mentions "Adam optimizer" and refers to "M-flow [8] code" but does not specify version numbers for these or other key software dependencies required for replication. |
| Experiment Setup | Yes | "All experiments are trained with a batch size of 100, for 100 epochs, with an Adam optimizer and learning rate of 10⁻³." A minimal sketch of this setup follows the table. |
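For concreteness, the reported hyperparameters map onto a short training loop. The sketch below assumes a standard PyTorch workflow; the small MLP and the random tensors are hypothetical stand-ins for the ELCD model and the trajectory data, whose actual implementations are in the authors' repository (https://github.com/seanjaffe1/Extended-Linearized-Contracting-Dynamics).

```python
# Hedged sketch of the reported training configuration: batch size 100,
# 100 epochs, Adam optimizer, learning rate 1e-3. The MLP and random data
# below are placeholders, NOT the authors' ELCD model or datasets.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

dim = 2  # e.g., planar LASA motions
states = torch.randn(1000, dim)       # placeholder trajectory states
velocities = torch.randn(1000, dim)   # placeholder target velocities
loader = DataLoader(TensorDataset(states, velocities),
                    batch_size=100, shuffle=True)

model = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(100):
    for x, x_dot in loader:
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(x), x_dot)
        loss.backward()
        optimizer.step()
```

Consistent with the paper's note that all computation is done on CPUs, nothing in this sketch requires a GPU.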