Rough Transformers: Lightweight and Continuous Time Series Modelling through Signature Patching

Authors: Fernando Moreno-Pino, Alvaro Arroyo, Harrison Waldon, Xiaowen Dong, Alvaro Cartea

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this section, we present empirical results for the effectiveness of the Rough Transformer, hereafter denoted RFormer, on a variety of time-series-related tasks.
Researcher Affiliation | Academia | Fernando Moreno-Pino (1), Álvaro Arroyo (1), Harrison Waldon (1), Xiaowen Dong (1,2), Álvaro Cartea (1,3); (1) Oxford-Man Institute, University of Oxford; (2) Machine Learning Research Group, University of Oxford; (3) Mathematical Institute, University of Oxford
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | Yes | Code available at: https://github.com/AlvaroArroyo/RFormer
Open Datasets | Yes | Next, we consider the Heart Rate dataset from the TSR archive [83], originally sourced from Beth Israel Deaconess Medical Center (BIDMC).
Dataset Splits | Yes | As previously done in [57], the original train and test datasets are merged and then randomly divided into new train, validation, and test sets, following a 70/15/15 split. (See the split sketch below this table.)
Hardware Specification | Yes | All experiments are conducted on an NVIDIA GeForce RTX 3090 GPU with 24,564 MiB of memory, utilizing CUDA version 12.3.
Software Dependencies | No | The paper mentions 'CUDA version 12.3' but does not list other key software dependencies with specific version numbers, such as Python or deep learning frameworks like PyTorch or TensorFlow.
Experiment Setup | Yes | Experimental and hyperparameter details regarding the implementation of the method are in Appendices C and D. Hyperparameters used to produce the results in Table 2 are reported in Table 6.
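
The Dataset Splits row describes a merge-then-resplit protocol: the archive's original train and test sets are pooled and then randomly re-partitioned 70/15/15 into train, validation, and test sets. The snippet below is a minimal sketch of such a split, not the authors' released code; the function name `split_70_15_15`, the NumPy array inputs, and the seed value are illustrative assumptions.

```python
import numpy as np

def split_70_15_15(X, y, seed=0):
    """Randomly partition (X, y) into 70/15/15 train/validation/test subsets."""
    rng = np.random.default_rng(seed)
    n = len(X)
    idx = rng.permutation(n)                 # shuffle sample indices
    n_train = int(0.70 * n)
    n_val = int(0.15 * n)
    train_idx = idx[:n_train]
    val_idx = idx[n_train:n_train + n_val]
    test_idx = idx[n_train + n_val:]         # remaining ~15%
    return ((X[train_idx], y[train_idx]),
            (X[val_idx], y[val_idx]),
            (X[test_idx], y[test_idx]))

# Usage sketch: merge the original train/test splits first, then re-split.
# X_all = np.concatenate([X_train_orig, X_test_orig])
# y_all = np.concatenate([y_train_orig, y_test_orig])
# train, val, test = split_70_15_15(X_all, y_all, seed=42)
```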