SSMF: Shifting Seasonal Matrix Factorization

Authors: Koki Kawabata, Siddharth Bhatia, Rui Liu, Mohit Wadhwa, Bryan Hooi

NeurIPS 2021

Reproducibility assessment — variable, result, and supporting excerpt:

Research Type: Experimental
  "We demonstrate that our algorithm outperforms state-of-the-art baseline methods by accurately forecasting upcoming events on three real-world data streams."

Researcher Affiliation: Academia
  Koki Kawabata (SANKEN, Osaka University, koki@sanken.osaka-u.ac.jp); Siddharth Bhatia (National University of Singapore, siddharth@comp.nus.edu.sg); Rui Liu (National University of Singapore, xxliuruiabc@gmail.com); Mohit Wadhwa (mailmohitwadhwa@gmail.com); Bryan Hooi (National University of Singapore, bhooi@comp.nus.edu.sg)

Pseudocode: Yes
  "Algorithm 1 SSMF"

Open Source Code: Yes
  "Reproducibility: Our datasets and source code are publicly available at: https://www.github.com/kokikwbt/ssmf"

Open Datasets: Yes
  "Table 2: Dataset Description..." NYC-YT: https://www1.nyc.gov/site/tlc/about/tlc-trip-record-data.page; NYC-CB: https://s3.amazonaws.com/tripdata/index.html; DISEASE: https://www.tycho.pitt.edu/data/

Dataset Splits: Yes
  "For SSMF and SMF, we determined a learning rate α in α = {0.1, 0.2, 0.3, 0.4} by cross-validation in each training data."

Hardware Specification: Yes
  "We implemented our algorithm in Python (ver. 3.7.4) and all the experiments were conducted on an Intel Xeon W-2123 3.6GHz quad-core CPU with 128GB of memory and running Linux."

Software Dependencies: Yes
  "We implemented our algorithm in Python (ver. 3.7.4) and all the experiments were conducted on an Intel Xeon W-2123 3.6GHz quad-core CPU with 128GB of memory and running Linux."

Experiment Setup: Yes
  "For TRMF, we searched for the best three regularization coefficients, λI, λAR, λLag, in λ = {0.0001, 0.001, 0.01, 0.1, 1, 10}... For SSMF and SMF, we determined a learning rate α in α = {0.1, 0.2, 0.3, 0.4}... The number of components k was set to 15 among all these methods for a fair comparison."
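The hyperparameter selection the report quotes — picking a learning rate α from {0.1, 0.2, 0.3, 0.4} by validation on the training data — can be illustrated with a minimal sketch. This is not the authors' code: the forecaster below is simple exponential smoothing, a hypothetical stand-in model, and `select_alpha` merely shows the grid-search pattern over the stated α values.

```python
# Illustrative sketch only (not SSMF): choose a learning rate alpha from a
# fixed grid by evaluating one-step-ahead forecast error on a validation
# split held out from the training data.

def ses_forecast(series, alpha):
    """One-step-ahead forecasts via simple exponential smoothing."""
    level = series[0]
    preds = []
    for x in series:
        preds.append(level)                    # forecast made before seeing x
        level = alpha * x + (1 - alpha) * level
    return preds

def rmse(pred, true):
    return (sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true)) ** 0.5

def select_alpha(train, val, grid=(0.1, 0.2, 0.3, 0.4)):
    """Return the alpha from the grid with the lowest validation RMSE."""
    best_alpha, best_err = None, float("inf")
    for alpha in grid:
        # Forecast over the whole series, score only the validation portion.
        preds = ses_forecast(train + val, alpha)[len(train):]
        err = rmse(preds, val)
        if err < best_err:
            best_alpha, best_err = alpha, err
    return best_alpha

series = [10.0, 12.0, 11.0, 13.0, 12.0, 14.0, 13.0, 15.0, 14.0, 16.0]
alpha = select_alpha(series[:7], series[7:])
```

The same pattern would apply to TRMF's regularization coefficients, with the grid replaced by the quoted λ values.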