Low Rank Matrix Completion via Robust Alternating Minimization in Nearly Linear Time

Authors: Yuzhou Gu, Zhao Song, Junze Yin, Lichen Zhang

ICLR 2024

Reproducibility Assessment (Variable: Result. LLM Response)
Research Type: Theoretical. "In this paper, we take a major step towards a more efficient and error-robust alternating minimization framework. To this end, we develop an analytical framework for alternating minimization that can tolerate a moderate amount of errors caused by approximate updates. Moreover, our algorithm runs in time Õ(|Ω|k), which is nearly linear in the time to verify the solution while preserving the sample complexity. This improves upon all prior known alternating minimization approaches, which require Õ(|Ω|k²) time."
Researcher Affiliation: Collaboration. Yuzhou Gu (Institute for Advanced Study, yuzhougu@ias.edu); Zhao Song (Adobe Research, zsong@adobe.com); Junze Yin (Boston University, junze@bu.edu); Lichen Zhang (MIT CSAIL, lichenz@csail.mit.edu).
Pseudocode: Yes. Algorithm 1: alternating minimization for matrix completion; its INIT procedure clips the rows with large norms, then performs a Gram-Schmidt process. Algorithm 2: high-precision solver. Algorithm 3: fast, high-precision solver for weighted multiple response regression. Algorithm 4: clipping and Gram-Schmidt. Algorithm 5: alternating minimization for matrix completion (formal version of Algorithm 1).
Open Source Code: No. The paper contains no explicit statement or link indicating that source code for the described methodology is publicly available.
Open Datasets: No. The paper is theoretical and runs no experiments on datasets, so no training dataset is specified or made available.
Dataset Splits: No. The paper is theoretical and runs no experiments on datasets, so no validation splits are specified.
Hardware Specification: No. The paper is theoretical and focuses on algorithmic complexity and proofs; it mentions no specific hardware used for running experiments.
Software Dependencies: No. The paper is theoretical and describes no software implementation or dependencies with version numbers.
Experiment Setup: No. The paper is theoretical and describes no experimental setup with hyperparameters or training settings.
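For context, the pseudocode entries above describe variants of the classic alternating-minimization template for matrix completion. The following is a minimal NumPy sketch of that generic template, under stated assumptions: the function name and parameters are illustrative, the SVD-based initialization merely stands in for the paper's clipping + Gram-Schmidt INIT procedure, and the exact per-row least-squares updates are what the paper replaces with a faster, error-tolerant approximate solver (not reproduced here).

```python
import numpy as np

def alt_min_complete(M_obs, mask, k, iters=50):
    """Generic alternating least squares for rank-k matrix completion.

    M_obs : (m, n) array with observed entries (zeros elsewhere).
    mask  : (m, n) boolean array marking the observed set Omega.
    k     : target rank.
    Returns factors U (m, k) and V (n, k) with M ~ U @ V.T.
    """
    m, n = M_obs.shape
    # Illustrative spectral init: top-k left singular vectors of the
    # zero-filled matrix (the paper instead clips heavy rows and
    # orthonormalizes via Gram-Schmidt).
    U = np.linalg.svd(M_obs, full_matrices=False)[0][:, :k]
    V = np.zeros((n, k))
    for _ in range(iters):
        # Fix U, solve a least-squares problem per column over its
        # observed rows (the step the paper solves only approximately).
        for j in range(n):
            rows = mask[:, j]
            if rows.any():
                V[j], *_ = np.linalg.lstsq(U[rows], M_obs[rows, j], rcond=None)
        # Symmetric update: fix V, solve per row of U.
        for i in range(m):
            cols = mask[i]
            if cols.any():
                U[i], *_ = np.linalg.lstsq(V[cols], M_obs[i, cols], rcond=None)
    return U, V
```

Each sweep costs Õ(|Ω|k²) with exact least-squares solves as above; the paper's contribution is bringing this per-iteration cost down to Õ(|Ω|k) while proving the iteration still converges despite the approximation error.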