Many-body Approximation for Non-negative Tensors

Authors: Kazu Ghalamkari, Mahito Sugiyama, Yoshinobu Kawahara

NeurIPS 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We demonstrate the effectiveness of our approach in tensor completion and approximation." (Section 1, Introduction); "We conducted three experiments to see the usefulness and effectiveness of many-body approximation." (Section 3, Experiments)
Researcher Affiliation | Academia | Kazu Ghalamkari (RIKEN AIP, kazu.ghalamkari@riken.jp); Mahito Sugiyama (National Institute of Informatics / SOKENDAI, mahito@nii.ac.jp); Yoshinobu Kawahara (Osaka University / RIKEN AIP, kawahara@ist.osaka-u.ac.jp)
Pseudocode | Yes | Algorithm 1: Low-body tensor completion; Algorithm 2: Many-body approximation (a hedged sketch of the completion loop is given after this table)
Open Source Code | No | The paper provides no explicit statement about, or link to, open-source code for the proposed method.
Open Datasets | Yes | "We downloaded the COIL-100 dataset from the official website." "We downloaded traffic speed records in District 7, Los Angeles County, from PeMS." "TT_ChartRes and TT_Origami are seventh-order tensors that are produced from the Tokyo Tech Hyperspectral Image Dataset [23, 24]."
Dataset Splits | No | The paper mentions 'missing rates' and a 'recovery fit score' for tensor completion, where entries are artificially removed for evaluation, but it does not specify standard training/validation/test splits or a cross-validation methodology.
Hardware Specification | Yes | "Experiments were conducted on Ubuntu 20.04.1 with a single core of 2.1GHz Intel Xeon CPU Gold 5218 and 128GB of memory."
Software Dependencies | No | The paper states "All methods are implemented in Julia 1.8." but does not give versions for any libraries, frameworks, or other supporting software used in the experiments.
Experiment Setup | Yes | "The termination criterion is the same as the original implementation of Legendre decomposition by [36]; that is, it terminates if ||η_B^{(t)} − η̂_B|| < 10^{-5}." "We randomly initialized each missing element in P^{(t=1)} by a Gaussian distribution with mean 50 and variance 10." "HaLRTC requires a real value ρ and a weight vector α. We used the default setting of α described in the original paper and tuned the value ρ as ρ = 10^{-5}, 10^{-4}, ..., 10^{2}."
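
The Pseudocode row above refers to Algorithm 1 (low-body tensor completion) and Algorithm 2 (many-body approximation). The Julia sketch below, written under stated assumptions, only illustrates the alternating structure such a completion loop can take; it is not the authors' implementation. The function name `low_body_completion`, the stand-in `many_body_approx`, and the refill-missing-entries update rule are assumptions made here for illustration.

```julia
# Hedged sketch (not the authors' code) of the alternating structure a
# low-body tensor completion loop in the spirit of Algorithm 1 could take.
# `many_body_approx` is a stand-in for the paper's Algorithm 2; its internals
# are not reproduced here, and the update rule below is an assumption.

# Stand-in for Algorithm 2 (many-body approximation). A real implementation
# would run the paper's Legendre-decomposition-style updates.
many_body_approx(X) = X

function low_body_completion(X::AbstractArray{<:Real}, mask::AbstractArray{Bool};
                             tol = 1e-5, maxiter = 100)
    X = copy(float.(X))                  # work on a copy; missing entries assumed pre-initialized
    for _ in 1:maxiter
        Q = many_body_approx(X)                       # many-body approximation of the current tensor
        change = maximum(abs.(Q[mask] .- X[mask]))    # largest update on the missing entries
        X[mask] .= Q[mask]                            # assumed update: refill missing entries only
        change < tol && break
    end
    return X
end
```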
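
The Experiment Setup row quotes three concrete choices: the 10^{-5} termination threshold, the Gaussian initialization of missing entries (mean 50, variance 10), and the ρ grid for HaLRTC. The Julia snippet below restates those quoted choices as code; the toy tensor, missing-rate, seed, and variable names are illustrative assumptions, not taken from the paper.

```julia
# Hedged restatement of the quoted experiment-setup details.
# Variable names (P, missing_mask, ρ_grid) and the toy data are illustrative only.
using Random

Random.seed!(0)                                  # seed choice is an assumption

P = rand(10, 10, 10) .* 100                      # toy non-negative tensor
missing_mask = rand(size(P)...) .< 0.5           # e.g., a 50% missing rate

# "randomly initialized each missing element ... by a Gaussian distribution
#  with mean 50 and variance 10" (variance 10 => standard deviation sqrt(10))
P[missing_mask] .= 50 .+ sqrt(10) .* randn(count(missing_mask))

# Quoted termination threshold: stop when ||η_B^{(t)} − η̂_B|| < 10^{-5}
tol = 1e-5

# Quoted HaLRTC hyperparameter grid: ρ = 10^{-5}, 10^{-4}, ..., 10^{2}
ρ_grid = [10.0^k for k in -5:2]
```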