Exact Low Tubal Rank Tensor Recovery from Gaussian Measurements

Authors: Canyi Lu, Jiashi Feng, Zhouchen Lin, Shuicheng Yan

IJCAI 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this section, we conduct experiments to first verify the exact recovery guarantee in Theorem 4 for (3) from Gaussian measurements, then to verify the exact recovery guarantee in Theorem 6 for tensor completion (8). Both (3) and (8) can be solved by the standard ADMM [Lu et al., 2018b]. First, we test on random tensors, provided a sufficient number of measurements as suggested in Theorem 4. We generate $\mathcal{X}_0 \in \mathbb{R}^{n \times n \times n_3}$ of tubal rank $r$ by $\mathcal{X}_0 = \mathcal{P} * \mathcal{Q}$, where $\mathcal{P} \in \mathbb{R}^{n \times r \times n_3}$ and $\mathcal{Q} \in \mathbb{R}^{r \times n \times n_3}$ have i.i.d. standard Gaussian entries. We generate $A \in \mathbb{R}^{m \times (n^2 n_3)}$ with its entries being i.i.d., zero-mean, $1/m$-variance Gaussian variables. Then, let $y = A\,\mathrm{vec}(\mathcal{X}_0)$. We choose $n = 10, 20, 30$, $n_3 = 5$, $r = 0.2n$ and $r = 0.3n$. We set the number of measurements $m = 3r(2n - r)n_3 + 1$ as in Theorem 4. The results are given in Table 1, in which $\hat{\mathcal{X}}$ is the solution to (11). It can be seen that the relative errors $\|\hat{\mathcal{X}} - \mathcal{X}_0\|_F / \|\mathcal{X}_0\|_F$ are very small and the tubal ranks of $\hat{\mathcal{X}}$ are correct. (Illustrative sketches of this data generation and of the evaluation follow the table.)
Researcher Affiliation | Collaboration | Canyi Lu (1), Jiashi Feng (2), Zhouchen Lin (3,4), Shuicheng Yan (5,2); (1) Department of Electrical and Computer Engineering, Carnegie Mellon University; (2) Department of Electrical and Computer Engineering, National University of Singapore; (3) Key Laboratory of Machine Perception (MOE), School of EECS, Peking University; (4) Cooperative Medianet Innovation Center, Shanghai Jiao Tong University; (5) 360 AI Institute
Pseudocode | Yes | Algorithm 1 T-SVD. Input: $\mathcal{A} \in \mathbb{R}^{n_1 \times n_2 \times n_3}$. Output: T-SVD components $\mathcal{U}$, $\mathcal{S}$ and $\mathcal{V}$ of $\mathcal{A}$. (A sketch of this FFT-based factorization follows the table.)
Open Source Code | Yes | The codes of our methods can be found at https://sites.google.com/site/canyilu/
Open Datasets | No | The paper generates synthetic data for its experiments, for example: 'We generate $\mathcal{X}_0 \in \mathbb{R}^{n \times n \times n_3}$ of tubal rank $r$ by $\mathcal{X}_0 = \mathcal{P} * \mathcal{Q}$, where $\mathcal{P} \in \mathbb{R}^{n \times r \times n_3}$ and $\mathcal{Q} \in \mathbb{R}^{r \times n \times n_3}$ are with i.i.d. standard Gaussian random variables.' It does not use publicly available datasets.
Dataset Splits | No | The paper focuses on theoretical recovery guarantees and their empirical verification using generated data, rather than on standard training/validation/test splits of a dataset.
Hardware Specification | No | The paper does not specify any hardware details (e.g., GPU models, CPU types, or cloud platforms) used for conducting the experiments.
Software Dependencies | No | The paper mentions the 'Matlab command fft' and that the problems can be solved by the 'standard ADMM [Lu et al., 2018b]', but it does not provide specific version numbers for MATLAB or any other software libraries or dependencies.
Experiment Setup | Yes | We generate $\mathcal{X}_0 \in \mathbb{R}^{n \times n \times n_3}$ of tubal rank $r$ by $\mathcal{X}_0 = \mathcal{P} * \mathcal{Q}$, where $\mathcal{P} \in \mathbb{R}^{n \times r \times n_3}$ and $\mathcal{Q} \in \mathbb{R}^{r \times n \times n_3}$ are with i.i.d. standard Gaussian random variables. We generate $A \in \mathbb{R}^{m \times (n^2 n_3)}$ with its entries being i.i.d., zero-mean, $1/m$-variance Gaussian variables. Then, let $y = A\,\mathrm{vec}(\mathcal{X}_0)$. We choose $n = 10, 20, 30$, $n_3 = 5$, $r = 0.2n$ and $r = 0.3n$. We set the number of measurements $m = 3r(2n - r)n_3 + 1$ as in Theorem 4. (See the data-generation sketch after this table.)
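
The data-generation and measurement step quoted in the Research Type and Experiment Setup rows can be written out concretely. Below is a minimal NumPy sketch, assuming the usual FFT-based definition of the t-product; the authors' released code at the link above is in MATLAB, and the names t_product, X0, A and y here are illustrative rather than taken from that code.

```python
import numpy as np

def t_product(P, Q):
    """t-product P * Q: frontal-slice matrix products in the Fourier domain
    along the third dimension ((n1 x r x n3) * (r x n2 x n3) -> n1 x n2 x n3)."""
    n1, r, n3 = P.shape
    r2, n2, n3b = Q.shape
    assert r == r2 and n3 == n3b
    Pf = np.fft.fft(P, axis=2)
    Qf = np.fft.fft(Q, axis=2)
    Xf = np.stack([Pf[:, :, k] @ Qf[:, :, k] for k in range(n3)], axis=2)
    return np.real(np.fft.ifft(Xf, axis=2))

rng = np.random.default_rng(0)
n, n3 = 10, 5                       # one of the reported settings
r = int(0.2 * n)                    # tubal rank of the ground truth
P = rng.standard_normal((n, r, n3))
Q = rng.standard_normal((r, n, n3))
X0 = t_product(P, Q)                # ground-truth tensor of tubal rank r

m = 3 * r * (2 * n - r) * n3 + 1    # measurement count from Theorem 4
A = rng.standard_normal((m, n * n * n3)) / np.sqrt(m)  # i.i.d. N(0, 1/m) entries
y = A @ X0.reshape(-1, order="F")   # y = A vec(X0), column-major vectorization
```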
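
The Pseudocode row refers to Algorithm 1 (T-SVD), which factorizes a tensor by taking the FFT along the third dimension, computing a matrix SVD of each frontal slice in the Fourier domain, and transforming back. The sketch below is a straightforward rendering of that idea, together with the tubal rank it induces; it is not the authors' implementation (which, per the paper, relies on the Matlab command fft), and practical versions often exploit the conjugate symmetry of the FFT of real tensors to skip roughly half of the slice SVDs.

```python
import numpy as np

def t_svd(A):
    """Skinny t-SVD of A (n1 x n2 x n3): slice-wise SVDs in the Fourier domain,
    so that A = U * S * V^T in the t-product sense."""
    n1, n2, n3 = A.shape
    k = min(n1, n2)
    Af = np.fft.fft(A, axis=2)
    Uf = np.zeros((n1, k, n3), dtype=complex)
    Sf = np.zeros((k, k, n3), dtype=complex)
    Vf = np.zeros((n2, k, n3), dtype=complex)
    for i in range(n3):
        u, s, vh = np.linalg.svd(Af[:, :, i], full_matrices=False)
        Uf[:, :, i], Sf[:, :, i], Vf[:, :, i] = u, np.diag(s), vh.conj().T
    to_real = lambda T: np.real(np.fft.ifft(T, axis=2))
    return to_real(Uf), to_real(Sf), to_real(Vf)

def tubal_rank(A, tol=1e-8):
    """Tubal rank: the largest rank among the Fourier-domain frontal slices
    (equivalently, the number of nonzero singular tubes of S)."""
    Af = np.fft.fft(A, axis=2)
    return max(int(np.linalg.matrix_rank(Af[:, :, i], tol=tol))
               for i in range(A.shape[2]))
```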
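
Finally, the quantities behind Table 1, the Theorem 4 measurement budget $m = 3r(2n - r)n_3 + 1$ and the relative error $\|\hat{\mathcal{X}} - \mathcal{X}_0\|_F / \|\mathcal{X}_0\|_F$, can be tabulated for the reported settings. A minimal sketch, reusing tubal_rank from the previous block and assuming a recovered tensor X_hat obtained from some solver for the convex program (the paper uses a standard ADMM, not reproduced here):

```python
import numpy as np

def rel_error(X_hat, X0):
    """Relative error ||X_hat - X0||_F / ||X0||_F, the quantity reported in Table 1."""
    return np.linalg.norm(X_hat - X0) / np.linalg.norm(X0)

# Measurement counts m = 3r(2n - r)n3 + 1 for the settings quoted above.
n3 = 5
for n in (10, 20, 30):
    for rho in (0.2, 0.3):
        r = int(rho * n)
        m = 3 * r * (2 * n - r) * n3 + 1
        print(f"n={n:2d}, r={r}: m={m:5d} measurements vs ambient dimension {n * n * n3}")

# A recovered X_hat would then be checked with rel_error(X_hat, X0) and
# tubal_rank(X_hat) against the ground-truth rank r.
```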