Huber Additive Models for Non-stationary Time Series Analysis

Authors: Yingjie Wang, Xianrui Zhong, Fengxiang He, Hong Chen, Dacheng Tao

ICLR 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experimental results on both synthetic and real-world benchmark datasets validate the effectiveness of the proposed method.
Researcher Affiliation | Collaboration | 1. College of Informatics, Huazhong Agricultural University, China; 2. JD Explore Academy, JD.com Inc., China; 3. Department of Computer Science, University of Illinois at Urbana-Champaign, USA; 4. College of Science, Huazhong Agricultural University, China
Pseudocode | Yes | Algorithm 1: Optimization procedure for adaptive SpHAM
Open Source Code | Yes | The code is available at https://github.com/xianruizhong/SpHAM.
Open Datasets | Yes | Experimental results on both synthetic data and the real-world benchmark CauseMe (Runge et al., 2019) validate the effectiveness of the proposed method. ... We use the Air Quality dataset obtained from the UCI Machine Learning Repository (https://archive.ics.uci.edu/ml/datasets/Air+quality) ... CME data are provided in the Richardson and Cane List (http://www.srl.caltech.edu/ACE/ASC/DATA/level3/icmetable2.htm).
Dataset Splits | No | The paper specifies a training/test split: 'The samples at time t = {1500, 1501, ..., 1899} are used as a training set, and the samples at next time t = {1900, ..., 1999} are considered as the test data.' However, it does not mention a validation set or any validation split.
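The quoted split is purely index-based and can be sketched in a few lines. This is a minimal illustration, not the authors' code; the synthetic series below is a hypothetical stand-in, and only the index arithmetic reflects the paper's description.

```python
import numpy as np

# Hypothetical stand-in for the paper's synthetic series (assumption:
# a length-2000 sequence indexed t = 0..1999, as the quoted indices imply).
T = 2000
series = np.random.default_rng(0).normal(size=T)

# Split quoted in the paper: t = 1500..1899 for training,
# t = 1900..1999 for testing; no validation set is specified.
train = series[1500:1900]  # 400 training samples
test = series[1900:2000]   # 100 test samples

print(len(train), len(test))  # prints: 400 100
```

Note that because the data are a time series, the test block immediately follows the training block in time rather than being drawn at random.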
Hardware Specification | No | The paper does not explicitly describe the specific hardware (e.g., GPU models, CPU models, or memory) used to run the experiments.
Software Dependencies | No | The paper mentions 'original python packages' for competing methods and refers to algorithms like FISTA, but it does not specify version numbers for any software dependencies (e.g., Python, PyTorch, TensorFlow, scikit-learn).
Experiment Setup | Yes | Recall that the SpHAM algorithm requires three hyper-parameters: the regularization parameter λ, the kernel bandwidth d, and the Huber parameter σ. We set these parameters according to the suggestions in our Theorems. Based on the suggestion in Theorems 1-3, the Huber parameter is set to σ = T^(1/48) and the regularization parameter to λ = T^(-1). Moreover, we set the bandwidth d = 0.5 and tune l ∈ {100, 150, 200, ..., 350}.
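The quoted schedule ties every hyper-parameter except l to the sample size T, so it can be expressed as a small helper. This is a sketch under assumptions: the exponents T^(1/48) and T^(-1) are read from an imperfect extraction of the paper and may not match the authors' exact values, and the function name is invented for illustration.

```python
def spham_hyperparams(T, d=0.5):
    """Hyper-parameter schedule as quoted from the paper's setup.

    Assumptions: sigma = T**(1/48) and lam = T**(-1) are reconstructed
    from garbled superscripts in the source text; treat them as a
    best-effort reading, not the authors' verified values.
    """
    sigma = T ** (1.0 / 48.0)      # Huber parameter, grows slowly with T
    lam = T ** (-1.0)              # regularization parameter, shrinks with T
    l_grid = list(range(100, 351, 50))  # candidate l values to tune over
    return sigma, lam, d, l_grid

sigma, lam, d, l_grid = spham_hyperparams(T=2000)
```

With this schedule only l is tuned by search; σ and λ follow the theory-driven rates, which is why the paper reports no separate validation split for them.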