Whittle Networks: A Deep Likelihood Model for Time Series

Authors: Zhongjie Yu, Fabrizio G Ventola, Kristian Kersting

ICML 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our experimental results on stock market data, synthetic time series, MNIST, and hyperspectral images demonstrate that Whittle Networks can indeed capture complex dependencies between time series and provide a useful measure of uncertainty for neural networks.
Researcher Affiliation | Academia | Zhongjie Yu¹, Fabrizio Ventola¹, Kristian Kersting¹'² — ¹Department of Computer Science, TU Darmstadt, Darmstadt, Germany; ²Centre for Cognitive Science, TU Darmstadt, and Hessian Center for AI (hessian.AI). Correspondence to: Zhongjie Yu <yu@cs.tu-darmstadt.de>.
Pseudocode | No | The paper describes methods and concepts but does not include any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | Source code is available at: https://github.com/ml-research/WhittleNetworks
Open Datasets | Yes | Therefore, we use two real-world market datasets acquired from Yahoo! Finance. The first one is the index values of 11 sectors from Standard & Poor's (S&P) from October 16, 2013 to May 24, 2019 (see Fig. 1 (Left)). The second one is the global stock index (Stock) from 17 markets extracted from June 2, 1997 to June 30, 1999. Both S&P and Stock datasets are first transformed with log-returns, assuming them to be stationary (Stărică & Granger, 2005), and then cut with a sliding window of size 32, ending up with 44 and 50 time series instances.
Dataset Splits | No | The paper mentions 'training' and 'test' data splits for evaluation (e.g., in Table 1) but does not explicitly specify a 'validation' dataset split for hyperparameter tuning or early stopping.
Hardware Specification | No | The paper does not provide specific details about the hardware used for running the experiments, such as GPU models, CPU types, or memory specifications.
Software Dependencies | No | The paper mentions various software tools and implementations like 'LearnSPN', 'ResSPNs', 'RAT-SPN', 'MADE', and the 'OpenMarkov toolbox', but it does not specify concrete version numbers for these software dependencies, which would be necessary for full reproducibility.
Experiment Setup | Yes | The AE consists of an MLP with the following number of neurons for each layer: 128-64-16-2-16-64-128, using sigmoid as the activation function. The number of hidden layers in MADE is set to 1 for all datasets, while the hidden units vary from 200 to 600, depending on the number of RVs in each dataset.
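The preprocessing quoted under "Open Datasets" (log-return transform followed by a size-32 sliding window) can be sketched as follows. This is our own minimal illustration, not the authors' code; the function names and the toy price series are ours, and only the return formula and window size come from the paper.

```python
import math

def log_returns(prices):
    """Log-return transform: r_t = log(p_t / p_{t-1}); drops the first point."""
    return [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]

def sliding_windows(series, size=32, stride=1):
    """Cut a 1-D series into overlapping windows of the given size."""
    return [series[i:i + size] for i in range(0, len(series) - size + 1, stride)]

# Toy usage: 40 synthetic "prices" give 39 log-returns and 8 windows of 32.
prices = [100.0 * (1.01 ** t) for t in range(40)]
rets = log_returns(prices)
windows = sliding_windows(rets, size=32)
```

With real S&P or Stock index series in place of the toy prices, the same two steps would yield the 44 and 50 length-32 instances the paper reports.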
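The AE described under "Experiment Setup" can be sketched as a plain NumPy forward pass: an MLP autoencoder with the reported layer widths 128-64-16-2-16-64-128 and sigmoid activations. This is a hedged illustration only — the random weights stand in for trained parameters, and the paper does not specify initialization, loss, or training details.

```python
import numpy as np

rng = np.random.default_rng(0)
widths = [128, 64, 16, 2, 16, 64, 128]  # encoder down to a 2-d bottleneck, then decoder

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Untrained placeholder parameters, one weight matrix per consecutive layer pair.
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(widths, widths[1:])]
biases = [np.zeros(n) for n in widths[1:]]

def forward(x):
    """Map a batch of 128-d inputs through the bottleneck and back to 128-d."""
    h = x
    for W, b in zip(weights, biases):
        h = sigmoid(h @ W + b)
    return h

x = rng.standard_normal((5, 128))
recon = forward(x)  # reconstruction, shape (5, 128), values in (0, 1)
```

The sigmoid on the final layer keeps outputs in (0, 1), which matches the quoted setup; a trained version would fit `weights` and `biases` by minimizing a reconstruction loss.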