Matrix Factorization with Scale-Invariant Parameters

Authors: Guangxiang Zeng, Hengshu Zhu, Qi Liu, Ping Luo, Enhong Chen, Tong Zhang

IJCAI 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Extensive experiments on real-world dataset clearly validate both the effectiveness and efficiency of our method."
Researcher Affiliation | Collaboration | (1) School of Computer Science and Technology, University of Science and Technology of China (zgx@mail.ustc.edu.cn, qiliuql@ustc.edu.cn, cheneh@ustc.edu.cn); (2) Baidu Research-Big Data Lab (zhuhengshu@baidu.com, zhangtong10@baidu.com); (3) Key Lab of Intelligent Information Processing of Chinese Academy of Sciences, Institute of Computing Technology, Chinese Academy of Sciences (luop@ict.ac.cn)
Pseudocode | Yes | Algorithm 1: Scale-Invariant Matrix Factorization (FAVA).
Open Source Code | No | The paper does not provide any explicit statement about releasing source code, nor a link to a code repository for the described methodology.
Open Datasets | Yes | "Here we use the MovieLens 10M [Miller et al., 2003] dataset for validation."
Dataset Splits | Yes | "Specifically, we randomly split the large dataset into training set and test set (80% for training, 20% for test). All the sub-matrices were sampled from the training set. ... Unless otherwise noted, all methods conducted 5-fold cross-validation when run on the sub-matrix datasets." (A runnable sketch of this split protocol appears after the table.)
Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., CPU or GPU model, memory) used to run the experiments.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers (e.g., a Python version, or versions of libraries such as PyTorch, TensorFlow, or scikit-learn).
Experiment Setup | Yes | "In this experiment, we tuned parameters for TTF and TBF on the sub-matrices first, letting λ vary from 0.1 to 4.0 with step size 0.1, and γ ∈ {10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1000, 2000, 3000, 4000, 5000, 6000, 7000, 8000, 9000, 10000}. The convergence level of all methods is set to 10^-5." (A sketch of this parameter grid appears after the table.)
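
The split protocol quoted in the Dataset Splits row is standard and easy to reproduce. Below is a minimal sketch, assuming the ratings are available as (user, item, rating) triples; the synthetic array here is a stand-in for the MovieLens 10M data, and the sub-matrix sampling step of the paper is deliberately left as a comment.

```python
import numpy as np
from sklearn.model_selection import train_test_split, KFold

rng = np.random.default_rng(0)
# Stand-in for the MovieLens 10M triples: one (user_id, item_id, rating)
# row each. Synthetic here; in practice these would be parsed from the
# released ratings file.
ratings = np.column_stack([
    rng.integers(0, 1000, 10_000),      # user ids
    rng.integers(0, 500, 10_000),       # item ids
    rng.integers(1, 11, 10_000) / 2.0,  # half-star ratings 0.5..5.0
])

# 80% training / 20% test, split at random as described in the paper.
train, test = train_test_split(ratings, test_size=0.2, random_state=42)

# 5-fold cross-validation on the training portion (the paper applies
# this to sub-matrices sampled from the training set).
kf = KFold(n_splits=5, shuffle=True, random_state=42)
for fold, (tr_idx, va_idx) in enumerate(kf.split(train)):
    cv_train, cv_val = train[tr_idx], train[va_idx]
    # ... sample a sub-matrix from cv_train, fit the factorization,
    #     and evaluate on cv_val ...
```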
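
Likewise, the quoted tuning procedure maps onto an explicit parameter grid. The sketch below only enumerates that grid; the training call is left as a comment, since the paper's TTF/TBF implementations are not public, and the variable names are ours.

```python
import itertools
import numpy as np

# lambda varied from 0.1 to 4.0 in steps of 0.1 (40 values).
lambdas = np.round(np.arange(0.1, 4.01, 0.1), 1)

# gamma drawn from the explicit set quoted above (28 values).
gammas = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100,
          200, 300, 400, 500, 600, 700, 800, 900, 1000,
          2000, 3000, 4000, 5000, 6000, 7000, 8000, 9000, 10000]

TOL = 1e-5  # convergence level: iterate until the objective change < TOL

grid = list(itertools.product(lambdas, gammas))
print(f"{len(grid)} (lambda, gamma) configurations to evaluate")  # 1120
# For each (lam, gam) in grid: train TTF/TBF to convergence level TOL
# on the sub-matrices, score by 5-fold cross-validation, keep the best pair.
```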