Dimensionality Reduction for Stationary Time Series via Stochastic Nonconvex Optimization

Authors: Minshuo Chen, Lin Yang, Mengdi Wang, Tuo Zhao

NeurIPS 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Numerical experiments are provided to support our analysis. We demonstrate the effectiveness of our proposed algorithm using both simulated and real datasets.
Researcher Affiliation | Academia | Minshuo Chen (Georgia Institute of Technology), Lin F. Yang (Princeton University), Mengdi Wang (Princeton University), Tuo Zhao (Georgia Institute of Technology); emails {mchen393, tourzhao}@gatech.edu and {lin.yang, mengdiw}@princeton.edu
Pseudocode | Yes | Algorithm 1: Downsampled Oja's Algorithm (an illustrative code sketch is given after the table)
Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the methodology is openly available.
Open Datasets | Yes | We adopt the Air Quality dataset (De Vito et al., 2008)
Dataset Splits | No | The paper does not provide specific details regarding training, validation, or test dataset splits (e.g., percentages, sample counts, or explicit splitting methodology).
Hardware Specification | No | The paper does not provide specific details about the hardware used to run the experiments, such as GPU/CPU models or memory specifications.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers (e.g., library names, framework versions, or programming language versions with libraries).
Experiment Setup | Yes | The step size is η = 3 × 10⁻⁵, and the algorithm runs with 8 × 10⁵ total samples. Specifically, we set the step size η = η₀h/4000 if k < 2 × 10⁴, η = η₀h/8000 if k ∈ [2 × 10⁴, 5 × 10⁴), η = η₀h/48000 if k ∈ [5 × 10⁴, 10 × 10⁴), and η = η₀h/120000 if k ≥ 10 × 10⁴. We choose η₀ in {0.125, 0.25, 0.5, 1, 2} and report the final principal angles achieved by different block sizes h in Table 1. (A reconstruction of this step-size schedule is sketched in code after the table.)
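
To make the Pseudocode row more concrete, below is a minimal NumPy sketch of a downsampled Oja-style streaming PCA update. It is an illustration under stated assumptions, not a reproduction of the paper's Algorithm 1: it assumes the downsampling keeps one observation per block of size h (to weaken temporal dependence in the stationary stream) and re-orthonormalizes with a QR step after each update; the function name and arguments are hypothetical.

```python
import numpy as np

def downsampled_oja(stream, k, eta, h, seed=0):
    """Illustrative downsampled Oja-style update for streaming PCA.

    Assumptions (not taken from the paper's Algorithm 1): one sample is
    kept per block of h observations, and the iterate is re-orthonormalized
    by QR after every update.
    """
    rng = np.random.default_rng(seed)
    W = None
    for t, x in enumerate(stream):
        x = np.asarray(x, dtype=float)
        if W is None:
            # Random orthonormal initialization once the ambient dimension is known.
            W, _ = np.linalg.qr(rng.standard_normal((x.shape[0], k)))
        if t % h != h - 1:
            continue  # drop within-block samples to reduce temporal correlation
        W = W + eta * np.outer(x, x @ W)   # rank-one Oja step
        W, _ = np.linalg.qr(W)             # project back to orthonormal columns
    return W
```

The principal angles reported in Table 1 could then be computed, for example, from the singular values of W.T @ U_true, where U_true holds the leading eigenvectors of the population covariance used to generate the simulated data.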
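The Experiment Setup quote describes a piecewise-constant step-size schedule whose fractions were garbled in extraction. A small helper, assuming each fraction reads η₀·h over the quoted denominator with the quoted iteration breakpoints, might look like the sketch below; the reading of the fractions is an assumption, not confirmed against the original PDF.

```python
def step_size(k, eta0, h):
    """Piecewise step-size schedule reconstructed from the quoted setup.

    Assumption: each garbled fraction reads eta0 * h / denominator; the
    denominators and iteration breakpoints follow the quoted text.
    """
    if k < 2e4:
        return eta0 * h / 4000
    if k < 5e4:
        return eta0 * h / 8000
    if k < 10e4:
        return eta0 * h / 48000
    return eta0 * h / 120000
```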