Nonparametric Embeddings of Sparse High-Order Interaction Events

Authors: Zheng Wang, Yiming Xu, Conor Tillinghast, Shibo Li, Akil Narayan, Shandian Zhe

ICML 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | For evaluation, we conducted simulations to demonstrate that our theoretical bounds can indeed match the actual sparsity ratio and capture the asymptotic trend. Hence they can provide a reasonable convergence rate estimate and characterize the behavior of the prior. We then tested our approach NESH on three real-world datasets. NESH achieves much better predictive performance than the existing methods that use Poisson tensor factorization, additional time steps, local time dependency windows and triggering kernels.
Researcher Affiliation | Academia | (1) School of Computing, University of Utah; (2) Department of Mathematics, University of Utah; (3) Scientific Computing and Imaging (SCI) Institute, University of Utah.
Pseudocode | No | The paper describes the algorithm steps in text (Section 4, Algorithm) and in mathematical equations, but there are no formally labeled pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any statement or link indicating the availability of open-source code for the described method.
Open Datasets | Yes | We then examined the predictive performance of NESH on the following real-world datasets. (1) Taobao (https://tianchi.aliyun.com/dataset/dataDetail?dataId=53), ... (2) Crash (https://www.kaggle.com/usdot/nhtsa-traffic-fatalities), ... (3) Retail (https://tianchi.aliyun.com/dataset/dataDetail?dataId=37260), ...
Dataset Splits | No | We randomly split each dataset into 80% sequences for training, and the remaining 20% for test. The paper specifies training and test splits but does not explicitly mention a separate validation split.
Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., CPU, GPU models, memory) used to run the experiments. It only mentions implementing methods with PyTorch.
Software Dependencies | No | We implemented NESH, HP-Local, HP-TF and MGP-EF with PyTorch (Paszke et al., 2019), and the other methods with MATLAB. The software packages are named, but the specific version numbers needed for exact reproduction are not provided.
Experiment Setup | Yes | Specifically, we introduce a small set of pseudo inputs Z = [z1, . . . , zh] for f(·), where h is far less than the dimension of f. We then define the pseudo outputs b = [f(z1), . . . , f(zh)]⊤. ... We set the number of pseudo inputs to 100. We used the square exponential (SE) kernel and initialized the kernel parameters with 1. For HP-Local, the local window size was set to 50. For our method, we chose α from {0.5, 1.0, 1.5, 2.5, 3}. We conducted stochastic mini-batch optimization for all the methods, where the batch size was set to 100. We used the ADAM (Kingma and Ba, 2014) algorithm, and the learning rate was tuned from {5 × 10⁻⁴, 10⁻³, 3 × 10⁻³, 5 × 10⁻³, 10⁻²}. We ran each method for 400 epochs, which is enough to converge. We randomly split each dataset into 80% sequences for training, and the remaining 20% for test. We varied R, the dimension of the embeddings, from {2, 5, 8, 10}.
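To make the quoted configuration concrete, below is a minimal PyTorch-style sketch of the training loop it describes. Only the hyperparameter values (100 pseudo inputs, batch size 100, 400 epochs, the ADAM learning-rate grid, the random 80/20 split, and the embedding-rank grid) come from the quoted text; the model class `ToyNESH`, its placeholder objective, and the function names are hypothetical stand-ins, not the authors' implementation.

```python
# Sketch of the reported experiment setup; model and loss are placeholders.
import torch

# Hyperparameters quoted in the "Experiment Setup" evidence above.
NUM_PSEUDO_INPUTS = 100            # pseudo inputs Z = [z1, ..., zh], h = 100
EMBEDDING_DIMS = [2, 5, 8, 10]     # R varied over this grid
ALPHAS = [0.5, 1.0, 1.5, 2.5, 3]   # candidate values of alpha
LEARNING_RATES = [5e-4, 1e-3, 3e-3, 5e-3, 1e-2]
BATCH_SIZE = 100
NUM_EPOCHS = 400
TRAIN_FRACTION = 0.8               # random 80%/20% train/test split


def train_test_split(num_sequences: int, seed: int = 0):
    """Random 80/20 split over event sequences (returns index tensors)."""
    gen = torch.Generator().manual_seed(seed)
    perm = torch.randperm(num_sequences, generator=gen)
    n_train = int(TRAIN_FRACTION * num_sequences)
    return perm[:n_train], perm[n_train:]


class ToyNESH(torch.nn.Module):
    """Hypothetical stand-in for NESH: embeddings plus SE-kernel parameters.

    The real model places a sparse GP over the rate function using the
    pseudo inputs; here we only mirror the parameter shapes and the
    kernel-parameter initialization to 1 (log-parameters set to 0).
    """

    def __init__(self, num_entities: int, rank: int):
        super().__init__()
        self.embeddings = torch.nn.Parameter(0.1 * torch.randn(num_entities, rank))
        self.log_lengthscale = torch.nn.Parameter(torch.zeros(()))  # exp(0) = 1
        self.log_variance = torch.nn.Parameter(torch.zeros(()))     # exp(0) = 1
        self.pseudo_inputs = torch.nn.Parameter(torch.randn(NUM_PSEUDO_INPUTS, rank))

    def loss(self, batch_idx: torch.Tensor) -> torch.Tensor:
        # Placeholder objective; the paper optimizes a variational bound.
        emb = self.embeddings[batch_idx]
        reg = self.log_variance.exp() + self.log_lengthscale.exp()
        return (emb ** 2).mean() + 1e-3 * reg


def run_one_setting(num_entities: int = 500, rank: int = 8,
                    lr: float = 1e-3, seed: int = 0) -> ToyNESH:
    """Stochastic mini-batch ADAM training for one hyperparameter setting."""
    torch.manual_seed(seed)
    train_idx, _test_idx = train_test_split(num_entities, seed)
    model = ToyNESH(num_entities, rank)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _epoch in range(NUM_EPOCHS):
        perm = train_idx[torch.randperm(len(train_idx))]
        for start in range(0, len(perm), BATCH_SIZE):
            batch = perm[start:start + BATCH_SIZE]
            opt.zero_grad()
            loss = model.loss(batch)
            loss.backward()
            opt.step()
    return model


if __name__ == "__main__":
    run_one_setting(rank=EMBEDDING_DIMS[2], lr=LEARNING_RATES[1])
```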