Learning Dynamics Models with Stable Invariant Sets

Authors: Naoya Takeishi, Yoshinobu Kawahara (pp. 9782–9790)

AAAI 2021

Reproducibility

Variable | Result | LLM Response
Research Type | Experimental | We present experimental results in Section 6, with which we can confirm the validity of the proposed method and its usefulness for the application of long-term prediction.
Researcher Affiliation | Academia | Naoya Takeishi (1,2), Yoshinobu Kawahara (3,1). 1: RIKEN Center for Advanced Intelligence Project; 2: University of Applied Sciences and Arts Western Switzerland; 3: Kyushu University. Contact: naoya.takeishi@riken.jp, kawahara@imi.kyushu-u.ac.jp
Pseudocode | No | The paper describes the proposed method in prose and with a diagram (Figure 1) but does not include any formal pseudocode or algorithm blocks.
Open Source Code | No | The paper mentions 'Other details are found in the appendix' in Section 6.1, but does not explicitly state that source code for the described methodology is publicly available, nor does it provide a direct link to a repository.
Open Datasets | No | The paper describes generating its own datasets for experiments (e.g., 'We generated four sequences...', 'As training data, we generated such flow...'), but it does not provide any concrete access information (link, DOI, repository, or citation with author/year for a public dataset) for these datasets.
Dataset Splits | No | The paper mentions training data and test data (e.g., pairs of states used as training data in Section 6.2, and 'results of long-term prediction given only x_{t=0} that was not in the training data' for test data), but no validation set or split information is provided.
Hardware Specification | No | The paper does not provide specific details about the hardware used for running experiments, such as GPU models, CPU types, or memory specifications.
Software Dependencies | No | The paper mentions implementing components with 'neural networks' and using the 'exponential linear unit as the activation function', but it does not specify any software libraries (e.g., PyTorch, TensorFlow) or their version numbers.
Experiment Setup | No | The paper states 'Other details are found in the appendix' regarding implementation and configuration (Section 6.1) but does not provide specific hyperparameter values (e.g., learning rate, batch size, number of epochs) or other detailed training settings within the main text.