Forecasting Sequential Data Using Consistent Koopman Autoencoders
Authors: Omri Azencot, N. Benjamin Erichson, Vanessa Lin, Michael Mahoney
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate our method on a wide range of high-dimensional and short-term dependent problems, and it achieves accurate estimates for significant prediction horizons, while also being robust to noise. |
| Researcher Affiliation | Academia | 1Department of Mathematics at UC Los Angeles, CA, USA. 2ICSI and Department of Statistics at UC Berkeley, CA, USA. |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our code is available at github.com/erichson/koopmanAE. |
| Open Datasets | Yes | We extract a subset of the NOAA OI SST V2 High Resolution Dataset hereafter SST, and we refer to (Reynolds et al., 2007) for additional details. |
| Dataset Splits | No | The paper does not explicitly provide details about a validation dataset split. It mentions splitting data into training and test sets but omits specific validation split information. |
| Hardware Specification | No | The paper does not provide specific hardware details such as exact GPU/CPU models, processor types, or memory amounts used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers, such as library or solver names and their exact versions. |
| Experiment Setup | Yes | Our network minimizes Eq. (13) with a decaying learning rate initially set to 0.01. We fix the loss weights to λid = λfwd = 1, λbwd = 0.1, and λcon = 0.01, for the AE, forward forecast, backward prediction and consistency, respectively. We use 8 prediction steps forward and backward in time. |
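The experiment-setup row above reports four loss weights and a decaying learning rate. A minimal sketch of how those reported values combine into a single training objective is given below; the helper names (`total_loss`, `decayed_lr`) and the step-decay schedule are illustrative assumptions, not the authors' code, since the paper only states that the learning rate decays from 0.01.

```python
# Loss weights as reported in the paper: AE reconstruction, forward
# forecast, backward prediction, and consistency terms.
LAMBDA_ID, LAMBDA_FWD, LAMBDA_BWD, LAMBDA_CON = 1.0, 1.0, 0.1, 0.01


def total_loss(loss_id, loss_fwd, loss_bwd, loss_con):
    """Weighted sum of the four loss terms (the form of Eq. (13))."""
    return (LAMBDA_ID * loss_id
            + LAMBDA_FWD * loss_fwd
            + LAMBDA_BWD * loss_bwd
            + LAMBDA_CON * loss_con)


def decayed_lr(epoch, initial_lr=0.01, decay=0.5, every=30):
    """One common step-decay schedule; the decay factor and interval
    are assumptions -- the paper only gives the initial value 0.01."""
    return initial_lr * (decay ** (epoch // every))
```

With equal unit losses, the consistency and backward terms contribute far less than the reconstruction and forecast terms, reflecting their role as regularizers rather than primary objectives.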