Hierarchical Negative Binomial Factorization for Recommender Systems on Implicit Feedback

Authors: Li-Yen Kuo, Ming-Syan Chen

AAAI 2021, pp. 4181-4188 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | The experiment shows that the proposed model outperforms state-of-the-art Poisson-based methods with only a slight loss of inference speed.
Researcher Affiliation | Academia | Li-Yen Kuo, Ming-Syan Chen, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan
Pseudocode | Yes | Algorithm 1: Updating of Fast HNBF
Open Source Code | Yes | Source code and supplementary materials can be downloaded at https://github.com/iankuoli/HNBF
Open Datasets | Yes | The statistics are shown in Table 3. The first three datasets are implicit count data. Last.fm 1K... Last.fm 2K... Last.fm 360K... MovieLens 100K... MovieLens 1M... MovieLens 20M... Jester2... EachMovie...
Dataset Splits | Yes | We follow the works (Gopalan et al. 2014; Basbug and Engelhardt 2016, 2017) to randomly select 20% of nonzero entries for each dataset to be used as a test set, and randomly select 1% of the nonzeros in each dataset as a validation set. (A split sketch appears below the table.)
Hardware Specification | Yes | The experiment is conducted on a PC with a quad-core Intel Core i5 CPU @ 1.4 GHz and 16 GB main memory.
Software Dependencies | No | The paper mentions implementing methods but does not provide specific version numbers for any software, libraries, or frameworks used (e.g., Python, PyTorch, TensorFlow versions).
Experiment Setup | Yes | In the inference model, the prior parameter (a, b, c) is set to (3, 1, 0.1) on implicit count and (0.3, 0.1, 1) on explicit ratings, respectively. ... we set (g⁺, h⁺, g⁰, h⁰) to (100, 50, 10, 10⁸) on implicit count and (1, 1, 10, 10⁶) on explicit ratings. We empirically set h⁺ = min( E_{X⁺}[x_ui]² / Var_{X⁺}[θ_u β_i], E_{X⁺}[x_ui] / 3 ) and g⁺ = h⁺ per iteration during the training phase. (A hyperparameter-schedule sketch appears below the table.)
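
The Dataset Splits row quotes a simple hold-out protocol: 20% of the nonzero entries go to a test set and 1% to a validation set. The sketch below shows one way to perform that split on a SciPy sparse matrix; the function name split_nonzeros, the random seed, and the assumption that validation entries are drawn disjointly from test entries are illustrative choices, not details taken from the paper or the released code.

```python
import numpy as np
from scipy.sparse import coo_matrix

def split_nonzeros(X, test_frac=0.20, valid_frac=0.01, seed=0):
    """Randomly hold out nonzero entries of a sparse user-item matrix:
    test_frac of the nonzeros become the test set, valid_frac the
    validation set, and the remainder stays in the training matrix."""
    X = coo_matrix(X)
    rng = np.random.default_rng(seed)
    order = rng.permutation(X.nnz)

    n_test = int(round(test_frac * X.nnz))
    n_valid = int(round(valid_frac * X.nnz))
    test_idx = order[:n_test]
    valid_idx = order[n_test:n_test + n_valid]
    train_idx = order[n_test + n_valid:]

    def subset(idx):
        # Sparse matrix containing only the selected nonzero entries.
        return coo_matrix((X.data[idx], (X.row[idx], X.col[idx])), shape=X.shape)

    return subset(train_idx), subset(valid_idx), subset(test_idx)

# Example on a tiny random implicit-count matrix.
if __name__ == "__main__":
    counts = coo_matrix(np.random.poisson(0.3, size=(100, 80)))
    train, valid, test = split_nonzeros(counts)
    print(train.nnz, valid.nnz, test.nnz)
```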
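
The Experiment Setup row also reports a per-iteration schedule for (h⁺, g⁺) on the nonzero entries X⁺. The sketch below shows how such a moment-matching update could be computed from the current factors. It relies on my reading of the extraction-garbled formula (h⁺ = min(E[x_ui]² / Var[θ_u·β_i], E[x_ui]/3) with g⁺ = h⁺); the function name update_hplus_gplus, the argument layout, and the variance floor eps are assumptions, not the authors' implementation.

```python
import numpy as np

def update_hplus_gplus(x_nonzero, theta, beta, rows, cols, eps=1e-12):
    """Per-iteration moment-matching heuristic for (h+, g+) on the nonzero
    entries X+, under one reading of the quoted formula:
        h+ = min( E[x_ui]^2 / Var[theta_u . beta_i], E[x_ui] / 3 ),  g+ = h+.
    x_nonzero : 1-D array of observed nonzero counts
    theta     : (n_users, K) user factors;  beta : (n_items, K) item factors
    rows/cols : user/item indices of the nonzero entries
    """
    mean_x = x_nonzero.mean()
    # Reconstructed rates theta_u^T beta_i over the nonzero entries.
    rates = np.einsum("ij,ij->i", theta[rows], beta[cols])
    h_plus = min(mean_x ** 2 / max(rates.var(), eps), mean_x / 3.0)
    g_plus = h_plus  # the quote above states g+ = h+ per iteration
    return h_plus, g_plus
```

In a training loop this would presumably be recomputed once per iteration before the updates that consume (h⁺, g⁺); the exact placement within the authors' Algorithm 1 is not specified by the quote above.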