Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

SSMF: Shifting Seasonal Matrix Factorization

Authors: Koki Kawabata, Siddharth Bhatia, Rui Liu, Mohit Wadhwa, Bryan Hooi

NeurIPS 2021

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "We demonstrate that our algorithm outperforms state-of-the-art baseline methods by accurately forecasting upcoming events on three real-world data streams." |
| Researcher Affiliation | Academia | Koki Kawabata (SANKEN, Osaka University, EMAIL), Siddharth Bhatia (National University of Singapore, EMAIL), Rui Liu (National University of Singapore, EMAIL), Mohit Wadhwa (EMAIL), Bryan Hooi (National University of Singapore, EMAIL) |
| Pseudocode | Yes | "Algorithm 1 SSMF" |
| Open Source Code | Yes | "Reproducibility: Our datasets and source code are publicly available at: https://www.github.com/kokikwbt/ssmf" |
| Open Datasets | Yes | "Table 2: Dataset Description..." NYC-YT: https://www1.nyc.gov/site/tlc/about/tlc-trip-record-data.page; NYC-CB: https://s3.amazonaws.com/tripdata/index.html; DISEASE: https://www.tycho.pitt.edu/data/ |
| Dataset Splits | Yes | "For SSMF and SMF, we determined a learning rate α in α = {0.1, 0.2, 0.3, 0.4} by cross-validation in each training data." |
| Hardware Specification | Yes | "...all the experiments were conducted on an Intel Xeon W-2123 3.6GHz quad-core CPU with 128GB of memory and running Linux." |
| Software Dependencies | Yes | "We implemented our algorithm in Python (ver. 3.7.4)..." |
| Experiment Setup | Yes | "For TRMF, we searched for the best three regularization coefficients, λI, λAR, λLag, in λ = {0.0001, 0.001, 0.01, 0.1, 1, 10}... For SSMF and SMF, we determined a learning rate α in α = {0.1, 0.2, 0.3, 0.4}... The number of components k was set to 15 among all these methods for a fair comparison." |
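The tuning procedure quoted above (choosing the learning rate α from {0.1, 0.2, 0.3, 0.4} by cross-validation on each training set) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `cv_error` is a hypothetical stand-in for the cross-validated forecasting error that the real pipeline would compute by fitting SSMF/SMF on each training fold.

```python
# Candidate learning rates, as reported in the experiment setup.
ALPHAS = [0.1, 0.2, 0.3, 0.4]

def cv_error(alpha):
    """Hypothetical stand-in: the real pipeline would fit the model
    with this learning rate on each training fold and return the
    average forecasting error across folds."""
    return abs(alpha - 0.2)  # dummy objective for illustration only

# Exhaustive search over the grid, keeping the error-minimizing alpha.
best_alpha = min(ALPHAS, key=cv_error)
print(best_alpha)
```

The same pattern extends to TRMF's three regularization coefficients by searching the reported grid {0.0001, 0.001, 0.01, 0.1, 1, 10} for each of λI, λAR, and λLag.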