TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis

Authors: Haixu Wu, Tengge Hu, Yong Liu, Hang Zhou, Jianmin Wang, Mingsheng Long

ICLR 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our proposed TimesNet achieves consistent state-of-the-art in five mainstream time series analysis tasks, including short- and long-term forecasting, imputation, classification, and anomaly detection. Code is available at this repository: https://github.com/thuml/TimesNet.
Researcher Affiliation | Academia | Haixu Wu, Tengge Hu, Yong Liu, Hang Zhou, Jianmin Wang, Mingsheng Long. School of Software, BNRist, Tsinghua University, Beijing 100084, China. {whx20,liuyong21,htg21,h-zhou18}@mails.tsinghua.edu.cn, {jimwang,mingsheng}@tsinghua.edu.cn
Pseudocode | No | The paper describes the TimesBlock and its process using mathematical equations and textual explanations, but it does not include a formal pseudocode or algorithm block.
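Since the paper states its core mechanism only in equations, a minimal sketch may help: TimesNet's Equ. 1 selects dominant periods by taking the FFT of the series, picking the top-k amplitudes, and converting frequencies to period lengths. The helper below is a hypothetical NumPy illustration, not code from the paper's repository.

```python
import numpy as np

def find_periods(x, k=2):
    """Estimate the top-k dominant periods of a 1D series via FFT,
    mirroring the frequency-domain period detection of TimesNet (Equ. 1).
    Hypothetical helper -- the paper gives equations, not code."""
    n = len(x)
    amps = np.abs(np.fft.rfft(x))       # amplitude spectrum
    amps[0] = 0.0                       # drop the DC component
    top = np.argsort(amps)[-k:][::-1]   # indices of the k largest amplitudes
    return [n // f for f in top]        # frequency index -> period length

# A series with period 24 should be detected:
x = np.sin(2 * np.pi * np.arange(96) / 24)
print(find_periods(x, k=1))  # → [24]
```

In the full model, each detected period is used to fold the 1D series into a 2D tensor (period × number of cycles) before applying 2D convolutions.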
Open Source Code | Yes | Code is available at this repository: https://github.com/thuml/TimesNet.
Open Datasets | Yes | For the long-term setting, we follow the benchmarks used in Autoformer (2021), including ETT (Zhou et al., 2021), Electricity (UCI), Traffic (PeMS), Weather (Wetterstation), Exchange (Lai et al., 2018) and ILI (CDC), covering five real-world applications. For the short-term dataset, we adopt the M4 (Spyros Makridakis, 2018), which contains the yearly, quarterly and monthly collected univariate marketing data. We select 10 multivariate datasets from UEA Time Series Classification Archive (Bagnall et al., 2018). We compare models on five widely-used anomaly detection benchmarks: SMD (Su et al., 2019), MSL (Hundman et al., 2018), SMAP (Hundman et al., 2018), SWaT (Mathur & Tippenhauer, 2016), PSM (Abdulaal et al., 2021).
Dataset Splits | Yes | Table 6: Dataset descriptions. The dataset size is organized as (Train, Validation, Test); for example, ETTm1 and ETTm2 are (34465, 11521, 11521).
Hardware Specification | Yes | All experiments are repeated three times, implemented in PyTorch (Paszke et al., 2019) and conducted on a single NVIDIA TITAN RTX 24GB GPU.
Software Dependencies | Yes | All experiments are repeated three times, implemented in PyTorch (Paszke et al., 2019) and conducted on a single NVIDIA TITAN RTX 24GB GPU.
Experiment Setup | Yes | Table 7: Experiment configuration of TimesNet. All the experiments use the ADAM (2015) optimizer with the default hyperparameter configuration for (β1, β2) as (0.9, 0.999). Table 7 lists k (Equ. 1), layers, d_min, d_max, learning rate, loss, batch size, and epochs for each task.
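For concreteness, the reported optimizer settings correspond to the standard Adam update with (β1, β2) = (0.9, 0.999), which is also PyTorch's default. The function below is a plain NumPy sketch of a single Adam step under those values; the learning rate and remaining hyperparameters vary per task in Table 7, so the `lr` default here is only an assumption for illustration.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-4, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update step (Kingma & Ba, 2015) with the paper's
    (beta1, beta2) = (0.9, 0.999). Illustrative sketch only."""
    m = b1 * m + (1 - b1) * grad           # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad ** 2      # second-moment (variance) estimate
    m_hat = m / (1 - b1 ** t)              # bias correction for step t
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# On the first step, the bias-corrected update reduces to roughly -lr * sign(grad):
theta, m, v = adam_step(0.0, 1.0, 0.0, 0.0, t=1, lr=0.1)
print(theta)  # ≈ -0.1
```

In practice this is just `torch.optim.Adam(model.parameters(), lr=..., betas=(0.9, 0.999))`, matching the "default hyperparameter configuration" the paper describes.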