Decoupled Invariant Attention Network for Multivariate Time-series Forecasting

Authors: Haihua Xu, Wei Fan, Kun Yi, Pengyang Wang

IJCAI 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments on five datasets have demonstrated our superior performance with higher efficiency compared with state-of-the-art methods.
Researcher Affiliation | Academia | 1 Department of Computer and Information Science, University of Macau, China; 2 The State Key Laboratory of Internet of Things for Smart City, University of Macau, China; 3 University of Central Florida, USA; 4 Beijing Institute of Technology, China
Pseudocode | No | The paper describes the methodology using prose and mathematical equations but does not include any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | Additional analysis on distribution shift in time series is provided in Appendix [1]. [1] https://github.com/xhh39/DIAN
Open Datasets | Yes | We evaluate our proposed method on five real-world datasets and use the min-max normalization to normalize all these datasets. ... More detailed information about the datasets is provided in Appendix [1]. [1] https://github.com/xhh39/DIAN
Dataset Splits | Yes | Except for the COVID-19 dataset, we split the datasets into training, validation, and test sets with the ratio of 7:2:1 in a chronological order. For the COVID-19 dataset, the ratio is 6:2:2 because of the limitation of data scale in temporal dimension. (See the first sketch below.)
Hardware Specification | Yes | All models were evaluated on a Linux server with one RTX 3090 GPU.
Software Dependencies | No | The paper mentions using 'PyTorch' for implementation but does not specify its version number or any other software dependencies with version details.
Experiment Setup | Yes | We use MAE (Mean Absolute Errors) as the loss function and the Adam Optimizer with a learning rate of 1e-3 with proper early stopping. For the main experiment, we fix the lookback window length as 12 and the horizon window length as 12. (See the second sketch below.)
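The preprocessing quoted in the Open Datasets and Dataset Splits rows (min-max normalization plus a chronological 7:2:1 split, or 6:2:2 for COVID-19) can be illustrated with a minimal NumPy sketch. The helper names below are hypothetical, and computing the min/max statistics on the training split is an assumption the quoted text does not confirm; the authors' actual pipeline is in the linked repository.

```python
import numpy as np

def chronological_split(data, ratios=(0.7, 0.2, 0.1)):
    """Split a (time, variables) array into train/val/test in time order.

    ratios is (train, val, test): 7:2:1 for most datasets in the paper,
    6:2:2 for COVID-19 because of its short temporal span.
    """
    n = data.shape[0]
    n_train, n_val = int(n * ratios[0]), int(n * ratios[1])
    return data[:n_train], data[n_train:n_train + n_val], data[n_train + n_val:]

def min_max_normalize(train, *others, eps=1e-8):
    """Scale all splits to [0, 1] with min/max taken from the training split
    (a common convention; the quoted text does not say which split is used)."""
    lo, hi = train.min(axis=0), train.max(axis=0)
    scale = np.maximum(hi - lo, eps)
    return tuple((x - lo) / scale for x in (train, *others))

# Example with a synthetic multivariate series: 1000 steps, 8 variables.
series = np.random.rand(1000, 8)
train, val, test = chronological_split(series)          # 7:2:1
train, val, test = min_max_normalize(train, val, test)
```

For the COVID-19 dataset, the same helper would be called with ratios=(0.6, 0.2, 0.2).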
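Similarly, the Experiment Setup row (MAE loss, Adam with a learning rate of 1e-3, early stopping, and lookback and horizon windows of 12) maps onto a short PyTorch training loop. This is a sketch under stated assumptions, not the authors' DIAN implementation: the linear placeholder model, the synthetic data, the batch size of 32, the 100-epoch cap, and the early-stopping patience of 5 are all choices the paper does not specify.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

LOOKBACK, HORIZON, LR, PATIENCE = 12, 12, 1e-3, 5   # patience of 5 is an assumption
N_VARS = 8                                          # number of series; dataset-dependent

def make_windows(series, lookback=LOOKBACK, horizon=HORIZON):
    """Slice a (time, vars) tensor into (input, target) forecasting windows."""
    xs, ys = [], []
    for t in range(series.shape[0] - lookback - horizon + 1):
        xs.append(series[t:t + lookback])
        ys.append(series[t + lookback:t + lookback + horizon])
    return torch.stack(xs), torch.stack(ys)

# Synthetic stand-ins for the normalized train and validation splits.
train_x, train_y = make_windows(torch.rand(700, N_VARS))
val_x, val_y = make_windows(torch.rand(200, N_VARS))
train_loader = DataLoader(TensorDataset(train_x, train_y), batch_size=32, shuffle=True)
val_loader = DataLoader(TensorDataset(val_x, val_y), batch_size=32)

# Placeholder forecaster; the paper's DIAN model would be plugged in here.
model = nn.Sequential(nn.Flatten(), nn.Linear(LOOKBACK * N_VARS, HORIZON * N_VARS))

criterion = nn.L1Loss()                                   # MAE loss, as reported
optimizer = torch.optim.Adam(model.parameters(), lr=LR)   # Adam with lr = 1e-3

best_val, bad_epochs = float("inf"), 0
for epoch in range(100):
    model.train()
    for x, y in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(x).view_as(y), y)
        loss.backward()
        optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = sum(criterion(model(x).view_as(y), y).item() for x, y in val_loader)

    # "Proper early stopping": halt once validation MAE stops improving.
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= PATIENCE:
            break
```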