Functional Bayesian Tucker Decomposition for Continuous-indexed Tensor Data

Authors: Shikai Fang, Xin Yu, Zheng Wang, Shibo Li, Mike Kirby, Shandian Zhe

ICLR 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "The advantage of our method is shown in both synthetic data and several real-world applications. We release the code of FunBaT at https://github.com/xuangu-fang/Functional-Bayesian-Tucker-Decomposition"
Researcher Affiliation | Academia | "University of Utah, Salt Lake City, UT 84112, USA {shikai, xiny, wzuht, shibo, kirby, zhe}@cs.utah.edu"
Pseudocode | Yes | "Algorithm 1 FunBaT"
Open Source Code | Yes | "We release the code of FunBaT at https://github.com/xuangu-fang/Functional-Bayesian-Tucker-Decomposition"
Open Datasets | Yes | "Datasets We evaluated FunBaT on four real-world datasets: Beijing Air-PM2.5, Beijing Air-PM10, Beijing Air-SO2 and US-TEMP. The first three are extracted from Beijing Air [1]... We obtain US-TEMP from the Climate Change [2]." [1] https://archive.ics.uci.edu/ml/datasets/Beijing+Multi-Site+Air-Quality+Data [2] https://berkeleyearth.org/data/
Dataset Splits | Yes | "Following (Tillinghast et al., 2020), we randomly sampled 80% observed entry values for training and then tested on the remaining."
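The quoted split protocol (a random 80/20 partition of observed tensor entries) can be sketched as follows. This is an illustrative reconstruction, not the released FunBaT code; the `entries`/`values` arrays here are toy stand-ins for real observed entries.

```python
import numpy as np

# Toy stand-ins: N observed entries of a 3-mode tensor with their values.
rng = np.random.default_rng(0)
N = 1000
entries = rng.integers(0, 50, size=(N, 3))  # (index_mode1, index_mode2, index_mode3)
values = rng.standard_normal(N)             # observed entry values

# Randomly sample 80% of observed entries for training, test on the rest.
perm = rng.permutation(N)
n_train = int(0.8 * N)
train_idx, test_idx = perm[:n_train], perm[n_train:]

train_entries, train_values = entries[train_idx], values[train_idx]
test_entries, test_values = entries[test_idx], values[test_idx]
```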
Hardware Specification | No | The paper does not specify the hardware (e.g., GPU, CPU models, memory) used for running the experiments.
Software Dependencies | No | The paper mentions "We used PyTorch to implement FunBaT", but it does not provide specific version numbers for PyTorch or any other software dependencies.
Experiment Setup | Yes | "We re-scaled all continuous-mode indexes to [0, 1] to ensure numerical robustness. For FunBaT, we varied Matérn kernels ν = {1/2, 3/2} along the kernel parameters for optimal performance for different datasets. We examined all the methods with rank R ∈ {2, 3, 5, 7}. We set all modes' ranks to R."
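The two preprocessing/modeling choices quoted above (rescaling continuous-mode indexes to [0, 1], and Matérn kernels with ν = 1/2 or 3/2) can be sketched as below. The function names and length-scale value are illustrative assumptions, not taken from the paper or its released code.

```python
import numpy as np

def rescale_unit(t):
    """Min-max rescale continuous-mode indexes to [0, 1] for numerical robustness."""
    t = np.asarray(t, dtype=float)
    return (t - t.min()) / (t.max() - t.min())

def matern(r, ls, nu):
    """Matérn kernel as a function of distance r, for nu in {1/2, 3/2}.

    nu = 1/2 gives the exponential kernel; nu = 3/2 gives a once-differentiable
    process. These are the two smoothness settings quoted in the review.
    """
    r = np.abs(r)
    if nu == 0.5:
        return np.exp(-r / ls)
    if nu == 1.5:
        a = np.sqrt(3.0) * r / ls
        return (1.0 + a) * np.exp(-a)
    raise ValueError("only nu = 1/2 or 3/2 are handled in this sketch")

# Example: rescale timestamps of a continuous mode, then build a kernel matrix.
ts = rescale_unit([2010.0, 2012.5, 2015.0, 2020.0])
K = matern(ts[:, None] - ts[None, :], ls=0.3, nu=1.5)  # 4x4 covariance matrix
```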