Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Non-Convex Matrix Completion and Related Problems via Strong Duality
Authors: Maria-Florina Balcan, Yingyu Liang, Zhao Song, David P. Woodruff, Hongyang Zhang
JMLR 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | This work studies the strong duality of non-convex matrix factorization problems... We apply our framework to two prototypical matrix factorization problems: matrix completion and robust Principal Component Analysis... Our framework shows that exact recoverability and strong duality hold with nearly-optimal sample complexity for the two problems. ... In this section, we will present our experimental results. 8.1. Experiments on Synthetic Data 8.2. Experiments on Real Data |
| Researcher Affiliation | Academia | Maria-Florina Balcan Carnegie Mellon University, Yingyu Liang University of Wisconsin-Madison, Zhao Song UT-Austin & Harvard University, David P. Woodruff Carnegie Mellon University, Hongyang Zhang Carnegie Mellon University & TTIC |
| Pseudocode | No | The paper describes algorithms and methods but does not include any explicitly labeled pseudocode or algorithm blocks. It references algorithms like the Douglas-Rachford algorithm but does not present them as pseudocode within the paper. |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. It thanks collaborators for providing code for a specific function, but does not state that the authors' own implementation is publicly available. |
| Open Datasets | Yes | To verify the performance of the algorithms on real data, we conduct experiments on the Hopkins 155 data set. |
| Dataset Splits | Yes | We uniformly sample m entries from the matrix as our observations and run the matrix completion algorithms. ... m = 0.05 n1n2 and m = 0.1 n1n2 |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU, GPU models, memory, or cloud instances) used for running the experiments. |
| Software Dependencies | No | The paper mentions using the 'Augmented Lagrange Multiplier Method' and refers to the 'Douglas-Rachford algorithm' but does not specify any software libraries or tools with version numbers. |
| Experiment Setup | Yes | We set the parameter r in r minimization (15) as the true rank, and use the Augmented Lagrange Multiplier Method (Chen et al., 2009) for optimization... The parameter r in the r minimization is set as the number of moving objects which is known to us in the data set. |
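The setup quoted above (uniformly sample m of the n1·n2 entries, then run matrix completion with r set to the true rank) can be illustrated with a minimal NumPy sketch. This is not the paper's algorithm — the authors solve an r* minimization via the Augmented Lagrange Multiplier Method — but a simple rank-r hard-impute iteration on synthetic low-rank data, with the observation rate chosen here for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, r = 60, 60, 3
# Ground-truth rank-r matrix, as in the synthetic experiments.
M = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))

# Observation rate (hypothetical; the paper reports m = 0.05 n1n2 and 0.1 n1n2).
p = 0.3
mask = rng.random((n1, n2)) < p  # uniformly sampled observed entries

def complete(M_obs, mask, rank, iters=300):
    """Rank-r hard-impute: alternately fill the unobserved entries with the
    current estimate and project onto rank-r matrices via truncated SVD."""
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        Z = np.where(mask, M_obs, X)          # keep observed entries fixed
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # best rank-r approximation
    return X

X = complete(M * mask, mask, r)
rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

At this size and sampling rate the relative recovery error drops well below the noise floor, mirroring the exact-recovery behavior the paper establishes at near-optimal sample complexity.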