Singleshot : a scalable Tucker tensor decomposition

Authors: Abraham Traoré, Maxime Bérar, Alain Rakotomamonjy

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | The scalability of the proposed approaches, which can be easily extended to handle some common constraints encountered in tensor decomposition (e.g. non-negativity), is proven via numerical experiments on both synthetic and real data sets.
Researcher Affiliation | Collaboration | Abraham Traoré, LITIS EA4108, University of Rouen Normandy, abraham.traore@etu.univ-rouen.fr; Maxime Bérar, LITIS EA4108, University of Rouen Normandy, maxime.berar@univ-rouen.fr; Alain Rakotomamonjy, LITIS EA4108, University of Rouen Normandy, and Criteo AI Lab, Criteo Paris, alain.rakoto@insa-rouen.fr
Pseudocode | Yes | Algorithm 1: Singleshot
Open Source Code | No | The paper mentions using the TensorLy tool, but it does not state that the authors are releasing their own source code for the methodology described in the paper.
Open Datasets | Yes | The experiments are performed on the Movielens dataset [15].
Dataset Splits | Yes | Results are averaged over five 50/50 train-test random splits (a minimal split sketch follows the table).
Hardware Specification | No | Experiments have been run on Mac OS with 32 GB of memory. This mentions the operating system and memory size, but lacks specific hardware details such as CPU or GPU models.
Software Dependencies | No | For the tensor computation, we have used the TensorLy tool [22]. No specific version number for TensorLy or other software dependencies is provided (a brief TensorLy usage sketch follows the table).
Experiment Setup | Yes | For Singleshot, the step size is fixed to 10^-6. For Singleshot-inexact, the step sizes are fixed to 10^-6 and 10^-8 for Enron and Movielens, respectively. The rank of the core G is (5, 5, 5) (a fixed-step gradient sketch follows the table).
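
The Dataset Splits row reports five 50/50 train-test random splits. The snippet below is a minimal sketch of that protocol, assuming a hypothetical array of observed rating entries; the paper's actual MovieLens preprocessing and evaluation metric are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical (user, item, time, rating) entries; the paper's actual
# MovieLens preprocessing is not reproduced here.
entries = rng.random((10_000, 4))

scores = []
for _ in range(5):                                # five random splits, as reported
    perm = rng.permutation(len(entries))
    half = len(entries) // 2
    train, test = entries[perm[:half]], entries[perm[half:]]  # 50/50 split
    # ... fit the decomposition on `train`, evaluate on `test` ...
    scores.append(0.0)                            # placeholder for the test metric

print(np.mean(scores))                            # average over the five splits
```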
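The Software Dependencies row refers to the TensorLy library. For reference, the sketch below calls TensorLy's standard Tucker routine with the rank-(5, 5, 5) core reported in the Experiment Setup row; the tensor shape and data are placeholders, and this is not the authors' Singleshot implementation. Depending on the TensorLy version, the rank argument may be named `rank` or `ranks`.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

# Placeholder third-order tensor; the paper's experiments use Enron and MovieLens data.
X = tl.tensor(np.random.rand(40, 50, 60))

# Off-the-shelf Tucker decomposition with a rank-(5, 5, 5) core.
core, factors = tucker(X, rank=[5, 5, 5])

print(core.shape)                  # (5, 5, 5)
print([A.shape for A in factors])  # [(40, 5), (50, 5), (60, 5)]
```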
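The Experiment Setup row reports fixed gradient step sizes of 10^-6 and 10^-8. The sketch below shows a plain fixed-step gradient update of the Tucker core for the objective 0.5 * ||X - G x1 A1 x2 A2 x3 A3||^2, using TensorLy helpers. It only illustrates a fixed-step update: it is not the sequential, chunk-wise scheme of Algorithm 1 (Singleshot), and the factor-matrix updates are omitted.

```python
import numpy as np
import tensorly as tl
from tensorly import tucker_to_tensor
from tensorly.tenalg import multi_mode_dot

# Random stand-ins for the data tensor and a rank-(5, 5, 5) initialisation.
X = tl.tensor(np.random.rand(40, 50, 60))
core = tl.tensor(np.random.rand(5, 5, 5))
factors = [tl.tensor(np.random.rand(dim, 5)) for dim in X.shape]

step = 1e-6  # fixed step size, as reported for Singleshot

for _ in range(10):
    X_hat = tucker_to_tensor((core, factors))               # current reconstruction
    residual = X_hat - X
    # Gradient of 0.5 * ||X - G x1 A1 x2 A2 x3 A3||^2 with respect to the core G:
    # the residual contracted with each factor matrix transposed along its mode.
    grad_core = multi_mode_dot(residual, factors, transpose=True)
    core = core - step * grad_core
```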