Exponentially Convergent Algorithms for Supervised Matrix Factorization

Authors: Joowon Lee, Hanbaek Lyu, Weixin Yao

NeurIPS 2023

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We numerically verify Theorem 3.5 on a semi-synthetic dataset generated by using the MNIST image dataset [24] (p = 28^2 = 784, q = 0, n = 500, κ = 1) and a text dataset named Real / Fake Job Posting Prediction [1] (p = 2840, q = 72, n = 17880, κ = 1). |
| Researcher Affiliation | Academia | Joowon Lee, Department of Statistics, University of Wisconsin, Madison, WI, USA, jlee2256@wisc.edu; Hanbaek Lyu, Department of Mathematics, University of Wisconsin, Madison, WI, USA, hlyu@math.wisc.edu; Weixin Yao, Department of Statistics, University of California, Riverside, CA, USA, weixiny@ucr.edu |
| Pseudocode | Yes | Algorithm 1: Lifted PGD for SMF (a generic projected-gradient sketch follows the table). |
| Open Source Code | Yes | We provide our implementation of Algorithm 1 in our code repository https://github.com/ljw9510/SMF/tree/main. |
| Open Datasets | Yes | We numerically verify Theorem 3.5 on a semi-synthetic dataset generated by using the MNIST image dataset [24] (p = 28^2 = 784, q = 0, n = 500, κ = 1) and a text dataset named Real / Fake Job Posting Prediction [1] (p = 2840, q = 72, n = 17880, κ = 1). We apply the proposed methods to two datasets from the Curated Microarray Database (CuMiDa) [14]. |
| Dataset Splits | Yes | Other parameters are chosen through 5-fold cross-validation (ξ ∈ {0.1, 1, 10} and λ ∈ {0.1, 1, 10}), and the algorithms are run for 1,000 iterations or until convergence (a cross-validation sketch follows the table). |
| Hardware Specification | No | The paper does not report the hardware used for its experiments (GPU/CPU models, processor speeds, or memory amounts). |
| Software Dependencies | No | The paper does not list the ancillary software (library or solver names with version numbers) needed to replicate the experiments. |
| Experiment Setup | Yes | For all experiments, λ = 2 and a stepsize of 0.01 were used. Other parameters are chosen through 5-fold cross-validation (ξ ∈ {0.1, 1, 10} and λ ∈ {0.1, 1, 10}), and the algorithms are run for 1,000 iterations or until convergence. |
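
The pseudocode row above refers to Algorithm 1, a lifted projected gradient descent (PGD) method for SMF. The snippet below is only a minimal, generic PGD template, not the authors' lifted objective or projection (both are in the linked repository); the least-squares objective, the norm-ball projection, and all variable names here are illustrative assumptions. The stepsize of 0.01 and the 1,000-iteration cap mirror the settings reported in the table.

```python
# Minimal sketch of a projected gradient descent loop, the template behind
# Algorithm 1 ("Lifted PGD for SMF"). The objective, gradient, and projection
# below are hypothetical stand-ins; the authors' actual lifted formulation is
# in https://github.com/ljw9510/SMF/tree/main.
import numpy as np

def projected_gradient_descent(grad, project, z0, stepsize=0.01, max_iter=1000, tol=1e-8):
    """Iterate z <- project(z - stepsize * grad(z)) until the update stalls."""
    z = z0.copy()
    for _ in range(max_iter):
        z_new = project(z - stepsize * grad(z))
        if np.linalg.norm(z_new - z) <= tol * max(1.0, np.linalg.norm(z)):
            return z_new
        z = z_new
    return z

# Illustrative use: a least-squares objective with projection onto a norm ball.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad = lambda z: A.T @ (A @ z - b)                        # gradient of 0.5*||Az - b||^2
project = lambda z: z / max(1.0, np.linalg.norm(z) / 5.0)  # project onto {z : ||z|| <= 5}
z_hat = projected_gradient_descent(grad, project, np.zeros(5))
```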
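
The dataset-splits and experiment-setup rows describe 5-fold cross-validation over a 3 x 3 hyperparameter grid. The sketch below shows such a selection loop under stated assumptions: scikit-learn is available for the fold splitting, `fit_and_score` is a hypothetical placeholder for training the model with one hyperparameter pair and scoring it on the held-out fold, and the names `xi` and `lam` stand in for the two grid-searched tuning parameters.

```python
# Minimal sketch of 5-fold cross-validation over the reported grid
# ({0.1, 1, 10} x {0.1, 1, 10}). `fit_and_score` is a hypothetical stand-in
# for training with one hyperparameter pair and returning a validation score;
# it is not part of the authors' code.
from itertools import product

import numpy as np
from sklearn.model_selection import KFold

def fit_and_score(X_tr, y_tr, X_val, y_val, xi, lam):
    # Placeholder: train with (xi, lam) on the training fold and return a
    # validation score. A constant is returned so the sketch runs end to end.
    return 0.0

def grid_search_cv(X, y, xis=(0.1, 1, 10), lams=(0.1, 1, 10), n_splits=5, seed=0):
    """Return the (xi, lam) pair with the best mean validation score."""
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    best_params, best_score = None, -np.inf
    for xi, lam in product(xis, lams):
        fold_scores = [
            fit_and_score(X[tr], y[tr], X[va], y[va], xi, lam)
            for tr, va in kf.split(X)
        ]
        mean_score = float(np.mean(fold_scores))
        if mean_score > best_score:
            best_params, best_score = (xi, lam), mean_score
    return best_params, best_score

# Illustrative call on random data.
X, y = np.random.default_rng(0).standard_normal((100, 20)), np.zeros(100)
print(grid_search_cv(X, y))
```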