Tensor Completion Made Practical

Authors: Allen Liu, Ankur Moitra

NeurIPS 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this section, we describe our experimental results and in particular how our algorithm compares to existing algorithms.
Researcher Affiliation | Academia | Allen Liu, Massachusetts Institute of Technology, Cambridge, MA 02139, cliu568@mit.edu; Ankur Moitra, Department of Mathematics, Massachusetts Institute of Technology, Cambridge, MA 02139, moitra@mit.edu
Pseudocode | Yes | Algorithm 1 FULL EXACT TENSOR COMPLETION
Open Source Code | Yes | The code for all of the experiments can be found in Section I.
Open Datasets | No | Uncorrelated tensors: generated by taking $T = \sum_{i=1}^{4} x_i \otimes y_i \otimes z_i$ where $x_i, y_i, z_i$ are random unit vectors. Correlated tensors: generated by taking $T = \sum_{i=1}^{4} 0.5^{i-1} x_i \otimes y_i \otimes z_i$ where $x_1, y_1, z_1$ are random unit vectors and for $i > 1$, $x_i, y_i, z_i$ are random unit vectors that have covariance 0.88 with $x_1, y_1, z_1$ respectively.
Dataset Splits | No | The paper mentions using a random subset of observations for alternating minimization steps but does not provide explicit train/validation/test dataset splits, percentages, or absolute counts for reproducibility.
Hardware Specification | No | The paper does not provide specific hardware details such as CPU/GPU models, memory, or computational resources used for running the experiments.
Software Dependencies | No | The paper does not specify software dependencies with version numbers, such as programming languages, libraries, or frameworks used for the implementation.
Experiment Setup | Yes | We ran KRONECKER COMPLETION and STANDARD ALTERNATING MINIMIZATION for n = 200, r = 4 and either 50000 or 200000 observations. We ran 100 trials and took the median normalized MSE... For alternating minimization steps, we use a random subset consisting of half of the observations. We ran KRONECKER COMPLETION for 100 iterations compared to running STANDARD ALTERNATING MINIMIZATION for 400 iterations.
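Since the Open Datasets row indicates the synthetic tensors are described but not released, a reader would have to regenerate them. The sketch below is a minimal NumPy rendering of that description, assuming Gaussian random unit vectors; the way later components are made to have correlation roughly 0.88 with the first component is our own guess, since the paper's quoted text does not spell out that construction, and all function names are ours.

```python
import numpy as np

def random_unit_vector(n, rng):
    v = rng.standard_normal(n)
    return v / np.linalg.norm(v)

def correlated_unit_vector(base, corr, rng):
    # One possible construction (not specified in the paper): mix the base
    # direction with an independent random direction so the inner product
    # with `base` is roughly `corr`, then renormalize.
    noise = random_unit_vector(len(base), rng)
    v = corr * base + np.sqrt(1.0 - corr ** 2) * noise
    return v / np.linalg.norm(v)

def uncorrelated_tensor(n=200, r=4, rng=None):
    # T = sum_{i=1}^{r} x_i (x) y_i (x) z_i with random unit vectors.
    rng = rng if rng is not None else np.random.default_rng()
    T = np.zeros((n, n, n))
    for _ in range(r):
        x, y, z = (random_unit_vector(n, rng) for _ in range(3))
        T += np.einsum("i,j,k->ijk", x, y, z)
    return T

def correlated_tensor(n=200, r=4, corr=0.88, rng=None):
    # T = sum_{i=1}^{r} 0.5^{i-1} x_i (x) y_i (x) z_i, with later
    # components correlated with the first one.
    rng = rng if rng is not None else np.random.default_rng()
    x1, y1, z1 = (random_unit_vector(n, rng) for _ in range(3))
    T = np.einsum("i,j,k->ijk", x1, y1, z1)
    for i in range(1, r):
        x = correlated_unit_vector(x1, corr, rng)
        y = correlated_unit_vector(y1, corr, rng)
        z = correlated_unit_vector(z1, corr, rng)
        T += 0.5 ** i * np.einsum("i,j,k->ijk", x, y, z)
    return T
```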
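The Experiment Setup row fixes n = 200, r = 4, either 50000 or 200000 observed entries, 100 trials, and the median normalized MSE as the reported statistic. The sketch below shows one way those pieces could be wired together, assuming "normalized MSE" means squared Frobenius error divided by the squared Frobenius norm of the ground truth (the quoted text does not define it), and leaving the solvers (KRONECKER COMPLETION, STANDARD ALTERNATING MINIMIZATION) as placeholder callables since they are not reproduced here.

```python
import numpy as np

def sample_entries(n, m, rng):
    # Sample m distinct entries of an n x n x n tensor uniformly at random.
    flat = rng.choice(n ** 3, size=m, replace=False)
    return np.unravel_index(flat, (n, n, n))

def normalized_mse(T_hat, T):
    # Assumed definition: squared Frobenius error relative to ||T||_F^2.
    return np.linalg.norm(T_hat - T) ** 2 / np.linalg.norm(T) ** 2

def run_experiment(solver, make_tensor, n=200, m=50_000, trials=100, seed=0):
    # `solver` stands in for Kronecker completion or standard alternating
    # minimization; it receives observed indices and values and must
    # return a full n x n x n reconstruction.
    rng = np.random.default_rng(seed)
    errors = []
    for _ in range(trials):
        T = make_tensor(n=n, rng=rng)
        idx = sample_entries(n, m, rng)
        T_hat = solver(idx, T[idx], shape=(n, n, n))
        errors.append(normalized_mse(T_hat, T))
    return np.median(errors)
```

The 200000-observation setting corresponds to calling run_experiment with m=200_000; the quoted iteration budgets (100 iterations for KRONECKER COMPLETION versus 400 for STANDARD ALTERNATING MINIMIZATION, with alternating minimization steps using a random half of the observations) would live inside the solver callables.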