Matrix Completion has No Spurious Local Minimum

Authors: Rong Ge, Jason D. Lee, Tengyu Ma

NeurIPS 2016

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We prove that the commonly used non-convex objective function for positive semidefinite matrix completion has no spurious local minima: all local minima must also be global. Therefore, many popular optimization algorithms, such as (stochastic) gradient descent, can provably solve positive semidefinite matrix completion with arbitrary initialization in polynomial time. The result can be generalized to the setting where the observed entries contain noise. We believe that our main proof strategy can be useful for understanding geometric properties of other statistical problems involving partial or noisy observations. (A toy sketch of this objective appears below the table.)
Researcher Affiliation | Academia | Rong Ge, Duke University, 308 Research Drive, NC 27708, rongge@cs.duke.edu; Jason D. Lee, University of Southern California, 3670 Trousdale Pkwy, CA 90089, jasonlee@marshall.usc.edu; Tengyu Ma, Princeton University, 35 Olden Street, NJ 08540, tengyu@cs.princeton.edu.
Pseudocode | No | The paper describes conceptual approaches to optimization but does not provide structured pseudocode or algorithm blocks.
Open Source Code | No | The paper contains no statement about releasing source code and no link to a code repository for the described methodology.
Open Datasets | No | This is a theoretical paper; it does not train models on a specific dataset or provide information about dataset availability.
Dataset Splits | No | This is a theoretical paper and does not involve experimental validation or dataset splits.
Hardware Specification | No | This is a theoretical paper and does not mention any specific hardware used for experiments.
Software Dependencies | No | This is a theoretical paper and does not list any software dependencies with version numbers.
Experiment Setup | No | This is a theoretical paper and does not provide details of an experimental setup, such as hyperparameters or training configurations.
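To make the Research Type row concrete, here is a minimal sketch, not from the paper, of gradient descent on the factorized objective f(X) = (1/4) ||P_Omega(X X^T - M)||_F^2 that the result concerns. All problem sizes, the step size, and the iteration count below are illustrative assumptions, and the regularization term the paper's formal analysis adds to control row norms is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (sizes are illustrative, not from the paper): a rank-r PSD
# ground truth M = Z Z^T observed on a random symmetric subset Omega.
n, r, p = 50, 3, 0.3
Z = rng.standard_normal((n, r))
M = Z @ Z.T
mask = rng.random((n, n)) < p
mask = np.triu(mask) | np.triu(mask).T   # symmetric observation pattern

def objective(X):
    # f(X) = (1/4) * || P_Omega(X X^T - M) ||_F^2
    return 0.25 * np.sum((mask * (X @ X.T - M)) ** 2)

def gradient(X):
    # grad f(X) = P_Omega(X X^T - M) @ X  (uses symmetry of the residual)
    return (mask * (X @ X.T - M)) @ X

# Plain gradient descent from a random start: the paper's geometric result
# implies there are no spurious local minima for such methods to get stuck in.
X = rng.standard_normal((n, r))
step = 1e-3   # ad hoc step size for this toy instance
for _ in range(5000):
    X -= step * gradient(X)

print("final objective:", objective(X))
print("relative error:", np.linalg.norm(X @ X.T - M) / np.linalg.norm(M))
```

Under the paper's assumptions (roughly, incoherence of M and sufficiently many observed entries), runs like this converge to a global minimum regardless of the random initialization; the sketch only illustrates the shape of the objective and is not the authors' code.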