Random Tensor Theory for Tensor Decomposition
Authors: Mohamed Ouerfelli, Mohamed Tamaazousti, Vincent Rivasseau
AAAI 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "we show improvements with respect to state of the art on synthetic and real data which also highlights a promising potential for practical applications." From the section "Numerical Experiments": "In this section we will investigate the empirical results of the previously mentioned applications in order to see if they match with our theoretical results." |
| Researcher Affiliation | Collaboration | (1) Université Paris-Saclay, CEA, List, F-91120 Palaiseau, France; (2) Université Paris-Saclay, CNRS/IN2P3, IJCLab, 91405 Orsay, France |
| Pseudocode | Yes | Algorithm 1: Recovery algorithm associated to the graph G and edge e. Input: the tensor T = βv_0^{⊗k} + Z. Goal: estimate v_0. Compute the matrix M_{G,e}(T), then compute its top eigenvector by matrix power iteration (repeat v_i ← Σ_j M_{ij} v_j). Output: an estimated vector v. (A runnable sketch of this recovery procedure appears after the table.) |
| Open Source Code | No | No explicit statement about providing open-source code or a link to a code repository for the methodology described in the paper was found. |
| Open Datasets | Yes | We carried out experiments on structured real data, the Yale Face Database B (Lee, Ho, and Kriegman 2005). |
| Dataset Splits | No | The paper describes running experiments multiple times (e.g., '200 experiments', '100 times each instance') but does not specify training, validation, or test set splits (e.g., percentages or sample counts). |
| Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory, or cloud instance types) used for running the experiments are mentioned. |
| Software Dependencies | No | No specific software dependencies with version numbers are mentioned in the paper. |
| Experiment Setup | No | The paper lacks specific experimental setup details such as hyperparameters (learning rate, batch size, epochs), optimizer settings, or other configuration parameters. It mentions general settings like 'n=100' or 'different beta' but not fine-grained setup. |
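
To make the pseudocode row above concrete, below is a minimal, runnable Python sketch of the recovery procedure for the order-3 spiked tensor model T = βv_0^{⊗3} + Z. It is not the paper's exact construction: the matrix M_{G,e}(T) depends on the chosen graph G and edge e, and the simple self-contraction M_ij = Σ_{k,l} T_{ikl} T_{jkl} used here is only an illustrative stand-in. The noise scaling, the value of β, and the helper names are assumptions made for the demo; only n = 100 matches a setting mentioned in the paper.

```python
import numpy as np

def spiked_tensor(n, beta, rng):
    """Order-3 spiked tensor T = beta * v0^{⊗3} + Z (assumed noise model for this demo)."""
    v0 = rng.standard_normal(n)
    v0 /= np.linalg.norm(v0)                         # unit-norm planted signal
    Z = rng.standard_normal((n, n, n)) / np.sqrt(n)  # assumed noise scaling
    T = beta * np.einsum('i,j,k->ijk', v0, v0, v0) + Z
    return T, v0

def recover_signal(T, n_iter=200, seed=0):
    """Estimate the planted vector: build a matrix from T, then power-iterate.

    M_ij = sum_{k,l} T_{ikl} T_{jkl} is an illustrative self-contraction standing in
    for the paper's graph-based matrix M_{G,e}(T), whose exact form depends on the
    chosen graph G and cut edge e.
    """
    M = np.einsum('ikl,jkl->ij', T, T)
    v = np.random.default_rng(seed).standard_normal(M.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):          # repeat v_i <- sum_j M_ij v_j, then renormalize
        v = M @ v
        v /= np.linalg.norm(v)
    return v

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # n = 100 as mentioned in the paper's experiments; beta = 10 is an assumed value
    # well above the recovery threshold of this simple contraction-based method.
    T, v0 = spiked_tensor(n=100, beta=10.0, rng=rng)
    v_hat = recover_signal(T)
    print("correlation |<v_hat, v0>| =", abs(v_hat @ v0))
```

The printed correlation |⟨v̂, v_0⟩| is a standard recovery metric for the spiked tensor model; values close to 1 indicate that the top eigenvector of the matrix built from T aligns with the planted signal.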