Liquid Time-constant Networks

Authors: Ramin Hasani, Mathias Lechner, Alexander Amini, Daniela Rus, Radu Grosu

AAAI 2021, pp. 7657-7666 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We then conduct a series of time-series prediction experiments to manifest the approximation capability of Liquid Time-Constant Networks (LTCs) compared to classical and modern RNNs.
Researcher Affiliation | Academia | 1. Massachusetts Institute of Technology (MIT); 2. Institute of Science and Technology Austria (IST Austria); 3. Technische Universität Wien (TU Wien)
Pseudocode | Yes | Algorithm 1: LTC update by fused ODE solver (a sketch of this update follows the table).
Open Source Code | No | The paper does not include a link to a code repository or an explicit statement about releasing the source code for its methodology.
Open Datasets | Yes | We use the Human Activity dataset described in (Rubanova, Chen, and Duvenaud 2019)
Dataset Splits | No | The paper does not explicitly provide percentages, sample counts, or citations to predefined training/validation/test splits. It states 'The experimental setup are provided in Appendix', but the appendix content is not accessible here.
Hardware Specification | No | The paper does not provide hardware details such as GPU model numbers, CPU types, or memory specifications used for running the experiments.
Software Dependencies | No | The paper mentions software components and libraries such as 'torchdiffeq', 'OpenAI Gym', the 'MuJoCo' physics engine, and the 'Adam' optimizer, along with specific ODE solvers, but does not provide their version numbers.
Experiment Setup | No | The paper states 'The experimental setup are provided in Appendix', but the appendix content is not accessible; the provided text contains no specific hyperparameter values or detailed training configurations for the experiments.
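
The fused ODE-solver update named in the Pseudocode row (Algorithm 1 of the paper) folds an explicit and an implicit Euler step into a single closed-form update of the hidden state. Since no official code is linked above, the following is a minimal NumPy sketch of that update rule; the tanh head for f, the parameter names (W, U, b), and all shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ltc_fused_step(x, I, dt, tau, A, W, U, b):
    """One fused-solver update of an LTC hidden state (sketch).

    Follows the closed-form step of Algorithm 1 in the paper:
        x(t+dt) = (x(t) + dt * f * A) / (1 + dt * (1/tau + f))
    where f = tanh(W x + U I + b) stands in for the paper's bounded
    neural-network nonlinearity f(x, I; theta).
    """
    f = np.tanh(W @ x + U @ I + b)            # bounded network head f(x, I)
    numerator = x + dt * f * A                # explicit part, gated by bias vector A
    denominator = 1.0 + dt * (1.0 / tau + f)  # implicit part: input-dependent time constant
    return numerator / denominator

# Hypothetical usage: 8 hidden units driven by a 3-dimensional input stream.
rng = np.random.default_rng(0)
n, m, dt = 8, 3, 0.1
x = np.zeros(n)
tau = np.ones(n)                        # base time constants
A = rng.standard_normal(n)              # bias vector from the LTC formulation
W = rng.standard_normal((n, n)) * 0.1
U = rng.standard_normal((n, m)) * 0.1
b = np.zeros(n)
for t in range(100):                    # unroll over a toy input sequence
    x = ltc_fused_step(x, rng.standard_normal(m), dt, tau, A, W, U, b)
```

As the paper argues, keeping the state-dependent term in the denominator bounds the hidden state for any step size, in the spirit of an implicit Euler step, while costing only one evaluation of f per update; that is the motivation for fusing the two solver steps.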