Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Koopman Kernel Regression
Authors: Petar Bevanda, Max Beier, Armin Lederer, Stefan Sosnowski, Eyke Hüllermeier, Sandra Hirche
NeurIPS 2023 | Venue PDF | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments demonstrate superior forecasting performance compared to Koopman operator and sequential data predictors in RKHS. |
| Researcher Affiliation | Academia | Petar Bevanda (TU Munich), Max Beier (TU Munich), Armin Lederer (TU Munich), Stefan Sosnowski (TU Munich), Eyke Hüllermeier (LMU Munich), Sandra Hirche (TU Munich) |
| Pseudocode | Yes | For a better overview, the pseudocode for regression and forecasting with our method is shown in Algorithm 1 ("Regression and LTI Forecasts using KKR"). |
| Open Source Code | Yes | Along with code for reproduction of our experiments, we provide a JAX [76]-reliant Python module implementing an sklearn [77]-compliant KKR estimator at https://github.com/TUM-ITR/koopcore. |
| Open Datasets | No | Sample trajectories (N=200, H=14); flow past a cylinder data. The paper refers to the data used but does not provide concrete access information (link, DOI, specific repository, or formal citation with authors/year) for a publicly available or open dataset. |
| Dataset Splits | No | The paper samples trajectories on both training and testing data and discusses generalization, but does not provide specific details about validation splits (percentages, counts, or methodology for their creation). |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, processor types) used for running its experiments. |
| Software Dependencies | No | we provide a JAX [76] reliant Python module implementing a sklearn [77] compliant KKR estimator at https://github.com/TUM-ITR/koopcore. The paper mentions software components (JAX, sklearn) but does not provide specific version numbers for them. |
| Experiment Setup | Yes | For fairness, the same kernel and hyperparameters are chosen for our KKR, PCR (EDMD), RRR [23], and regression with signature kernels (Sig-PDE) [48]... for the respectively optimal D_KKR = 100, D_PCR = 10, and 15 delays for Sig-PDEs... RBF kernel lengthscale ℓ |
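The experiment setup above (a shared RBF kernel with lengthscale ℓ across KKR and the PCR/EDMD, RRR, and Sig-PDE baselines, trained on N sampled trajectories of horizon H) can be sketched with a minimal one-step kernel predictor. This is not the paper's KKR implementation from `koopcore`; it is an illustrative stand-in using scikit-learn's `KernelRidge` on a toy linear system, with the dynamics matrix, sample sizes, and lengthscale chosen here for demonstration only.

```python
# Illustrative one-step kernel regression in an RKHS, rolled forward for
# forecasting. A sketch of the shared kernel-regression setup, NOT the
# paper's KKR estimator; all names and constants below are assumptions.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

# Toy 2-D linear system x_{t+1} = A x_t, sampled as N trajectories of
# horizon H (cf. N = 200, H = 14 in the paper's experiments).
A = np.array([[0.9, -0.1], [0.1, 0.9]])
N, H = 50, 14
X, Y = [], []
for _ in range(N):
    x = rng.normal(size=2)
    for _ in range(H):
        x_next = A @ x
        X.append(x)
        Y.append(x_next)
        x = x_next
X, Y = np.asarray(X), np.asarray(Y)

# RBF kernel with lengthscale ell enters via gamma = 1 / (2 * ell**2).
ell = 1.0
model = KernelRidge(kernel="rbf", gamma=1.0 / (2 * ell**2), alpha=1e-6)
model.fit(X, Y)

# Roll the learned one-step map forward to forecast a trajectory.
x = rng.normal(size=2)
forecast = [x]
for _ in range(H):
    x = model.predict(x.reshape(1, -1))[0]
    forecast.append(x)
forecast = np.asarray(forecast)  # shape (H + 1, 2)
```

An sklearn-compliant estimator such as the repository's KKR would expose the same `fit`/`predict` interface, which is what makes the "same kernel and hyperparameters" comparison across methods straightforward.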