SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling

Authors: Jiaxiang Dong, Haixu Wu, Haoran Zhang, Li Zhang, Jianmin Wang, Mingsheng Long

NeurIPS 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | To fully evaluate SimMTM, we conduct experiments on two typical time series analysis tasks: forecasting and classification, covering low-level and high-level representation learning. Further, we present the fine-tuning performance for each task under in- and cross-domain settings.
Researcher Affiliation | Academia | Jiaxiang Dong, Haixu Wu, Haoran Zhang, Li Zhang, Jianmin Wang, Mingsheng Long. School of Software, BNRist, Tsinghua University, China. {djx20,z-hr20}@mails.tsinghua.edu.cn, wuhaixu98@gmail.com, {lizhang,jimwang,mingsheng}@tsinghua.edu.cn
Pseudocode | No | The paper describes the SimMTM framework with equations and textual descriptions but does not include any pseudocode or algorithm blocks. (A hedged sketch of the masked-modeling objective is given after this table.)
Open Source Code | Yes | Code is available at https://github.com/thuml/SimMTM.
Open Datasets | Yes | Table 1: Summary of experiment benchmarks (Task, Dataset, Semantic). Forecasting: ETTh1, ETTh2 (Electricity); ETTm1, ETTm2 (Electricity); Weather (Weather); Electricity (Electricity); Traffic (Transportation). Classification: SleepEEG (EEG); Epilepsy (EEG); FD-B (Fault Detection); Gesture (Hand Movement); EMG (Muscle Responses). A.1 Dataset Description: (1) ETT (4 subsets) [58], (2) Weather [43], (3) Electricity [38], (4) Traffic [29], (5) SleepEEG [16], (6) Epilepsy [1], (7) FD-B [19], (8) Gesture [22], (9) EMG [30].
Dataset Splits | Yes | Table 7: Dataset descriptions. Samples are organized as (Train / Validation / Test). (A sketch of a conventional chronological split follows the table.)
Hardware Specification | Yes | All the experiments are repeated five times, implemented in PyTorch [28] and conducted on an NVIDIA A100 SXM4 40GB GPU. (A sketch of the repeated-runs protocol follows the table.)
Software Dependencies | No | The paper states 'implemented in PyTorch [28]' but does not provide a specific version number for PyTorch or any other software dependencies beyond the general library name.
Experiment Setup | Yes | Table 10: Model and training configuration for the Forecasting (Fore.) and Classification (Class.) tasks. Encoder: Fore. e_layers 2, d_model 16; Class. e_layers 3, d_model 128. Pre-training: Fore. learning rate 1e-3, batch size 32, epochs 50; Class. learning rate 1e-4, batch size 128, epochs 10. Fine-tuning: Fore. learning rate 1e-4, L2 loss, batch size {16,32}, epochs 10; Class. learning rate 1e-4, Cross-Entropy loss, batch size 32, epochs 300. (Transcribed into a config sketch below.)
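On the Pseudocode row: since the paper gives only equations and prose, the following is a minimal, hedged sketch of masked time-series pre-training in the spirit of SimMTM (several masked views of each series, similarity-weighted aggregation of their point-wise features, reconstruction of the unmasked series). The module choices (GRU encoder, linear decoder), shapes, and defaults are illustrative assumptions, not the authors' implementation; see the official repository above for the real code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def random_mask(x: torch.Tensor, mask_ratio: float = 0.5) -> torch.Tensor:
    """Zero out a random subset of time points. x: (batch, length, channels)."""
    keep = torch.rand(x.shape[:2], device=x.device) > mask_ratio  # (B, L)
    return x * keep.unsqueeze(-1).to(x.dtype)

class MaskedPretrainer(nn.Module):
    """Hypothetical stand-in modules; NOT the authors' architecture."""
    def __init__(self, n_channels: int, d_model: int = 16, n_views: int = 3):
        super().__init__()
        self.n_views = n_views
        self.encoder = nn.GRU(n_channels, d_model, batch_first=True)  # assumed encoder
        self.decoder = nn.Linear(d_model, n_channels)                 # point-wise projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, L, C)
        views = [random_mask(x) for _ in range(self.n_views)]
        # Point-wise representations of each masked view: (B, V, L, D)
        z = torch.stack([self.encoder(v)[0] for v in views], dim=1)
        # Series-wise similarities between views (cosine over pooled features)
        s = F.normalize(z.mean(dim=2), dim=-1)            # (B, V, D)
        w = torch.softmax(s @ s.transpose(1, 2), dim=-1)  # (B, V, V)
        # Aggregate the views' point-wise features by similarity, then decode
        z_agg = torch.einsum("bvu,buld->bvld", w, z)
        x_hat = self.decoder(z_agg)                       # (B, V, L, C)
        # Every view is trained to reconstruct the original, unmasked series
        return F.mse_loss(x_hat, x.unsqueeze(1).expand_as(x_hat))
```

Training would then be the usual loop (`loss = model(batch); loss.backward(); optimizer.step()`). SimMTM additionally constrains the series-wise similarities with a contrastive term, which this sketch omits for brevity.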
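On the Dataset Splits row: the paper's Table 7 lists per-dataset (Train / Validation / Test) sample counts. Below is a minimal sketch of the chronological splitting conventionally used for forecasting benchmarks such as ETT; the 70/10/20 ratio and the function name are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def chronological_split(series: np.ndarray, train_frac: float = 0.7,
                        val_frac: float = 0.1):
    """Split a (time, channels) array along time; no shuffling across boundaries."""
    t = len(series)
    n_train = int(t * train_frac)   # assumed ratios; see the paper's Table 7
    n_val = int(t * val_frac)
    return (series[:n_train],
            series[n_train:n_train + n_val],
            series[n_train + n_val:])
```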
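On the Hardware Specification row: the experiments are "repeated five times". A plausible reading is five runs under different random seeds with mean/std reporting; the seed scheme and helper below are assumptions, not details from the paper.

```python
import numpy as np
import torch

def run_repeated(experiment_fn, n_runs: int = 5):
    """Re-run an experiment under n_runs seeds and summarize the scores."""
    scores = []
    for seed in range(n_runs):
        torch.manual_seed(seed)   # hypothetical seeding scheme
        np.random.seed(seed)
        scores.append(experiment_fn())
    return float(np.mean(scores)), float(np.std(scores))
```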
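On the Experiment Setup row: the Table 10 hyperparameters transcribed into a Python configuration for reference. The dictionary layout and key names are mine; the numbers are the paper's.

```python
SIMMTM_CONFIG = {
    "forecasting": {
        "encoder":  {"e_layers": 2, "d_model": 16},
        "pretrain": {"lr": 1e-3, "batch_size": 32, "epochs": 50},
        "finetune": {"lr": 1e-4, "loss": "L2", "batch_size": (16, 32), "epochs": 10},
    },
    "classification": {
        "encoder":  {"e_layers": 3, "d_model": 128},
        "pretrain": {"lr": 1e-4, "batch_size": 128, "epochs": 10},
        "finetune": {"lr": 1e-4, "loss": "cross-entropy", "batch_size": 32, "epochs": 300},
    },
}
```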