Memory Augmented Neural Model for Incremental Session-based Recommendation

Authors: Fei Mi, Boi Faltings

IJCAI 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Through extensive experiments, we show that MAN boosts the performance of different neural methods and achieves state-of-the-art.
Researcher Affiliation | Academia | Fei Mi and Boi Faltings, Artificial Intelligence Laboratory, École Polytechnique Fédérale de Lausanne (EPFL), fei.mi@epfl.ch, boi.faltings@epfl.ch
Pseudocode | Yes | Algorithm 1 Memory Augmented Neural Recommender (a hedged memory-retrieval sketch follows this table).
Open Source Code | No | The paper mentions using the FAISS open-source library (https://github.com/facebookresearch/faiss) but does not state that the code for the described methodology (MAN) is open-source or provided.
Open Datasets | Yes | YOOCHOOSE: a public dataset from the RecSys Challenge 2015 (http://2015.recsyschallenge.com/challenge.html) containing click-streams on an e-commerce site over 6 months. DIGINETICA: click-stream data from another e-commerce site over a span of 5 months, used for the CIKM Cup 2016 (http://cikm2016.cs.iupui.edu/cikm-cup).
Dataset Splits | Yes | The last 10% of the training data based on time is used as the validation set (a split sketch follows this table).
Hardware Specification | Yes | Both models are trained using an NVIDIA TITAN X GPU with 12GB memory.
Software Dependencies | No | The paper mentions using the "FAISS open-source library" but does not specify a version number or other software dependencies with versions.
Experiment Setup | Yes | During training, the hidden layer size of GRU4Rec and NARM is set to 100, and the item embedding size of NARM is set to 50; the batch size is set to 512 and 30 epochs are trained for both models. ... Learning rates for neural models are 5e-4 and 1e-4 for YOOCHOOSE and DIGINETICA, and the learning rate to update the gating network is 1e-3. ... The number of nearest neighbors of MAN is set to 50 for YOOCHOOSE and 100 for DIGINETICA. (A configuration sketch follows this table.)
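
The pseudocode row names Algorithm 1 but the report does not reproduce it. Purely as a hedged sketch of the pattern that row and the FAISS note describe, the snippet below stores recent session embeddings in a FAISS index, retrieves the k nearest neighbor sessions at prediction time, and blends their votes with a neural model's scores through a scalar gate. All identifiers (memory_keys, memory_next_items, combined_scores) and the voting scheme are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch (not the authors' code): session memory retrieval with FAISS
# plus a gated blend with a neural recommender's scores.
import numpy as np
import faiss  # open-source library cited by the paper

EMBED_DIM = 100     # GRU4Rec/NARM hidden size reported in the paper
K_NEIGHBORS = 50    # 50 for YOOCHOOSE, 100 for DIGINETICA (setup row)
NUM_ITEMS = 40_000  # illustrative catalogue size

# Assumed memory layout: one embedding per stored session and the item that
# session clicked next; both arrays are placeholders for the sketch.
memory_keys = np.random.rand(10_000, EMBED_DIM).astype("float32")
memory_next_items = np.random.randint(0, NUM_ITEMS, size=10_000)

index = faiss.IndexFlatIP(EMBED_DIM)  # exact inner-product search
index.add(memory_keys)

def memory_scores(session_embedding: np.ndarray) -> np.ndarray:
    """Vote for items clicked by the k most similar stored sessions."""
    query = np.ascontiguousarray(session_embedding[None, :], dtype="float32")
    sims, ids = index.search(query, K_NEIGHBORS)
    scores = np.zeros(NUM_ITEMS, dtype="float32")
    for sim, mem_id in zip(sims[0], ids[0]):
        scores[memory_next_items[mem_id]] += sim
    return scores

def combined_scores(session_embedding, neural_scores, gate):
    """Blend neural and memory predictions with a scalar gate in [0, 1]."""
    return gate * neural_scores + (1.0 - gate) * memory_scores(session_embedding)
```

In this sketch the gate is just a number in [0, 1]; in the paper it is produced by a gating network trained with its own learning rate (1e-3, per the setup row).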
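
For the dataset-splits row, a minimal sketch of the time-based 90/10 split, assuming the training interactions sit in a pandas DataFrame with a numeric timestamp column (the column name and frame layout are assumptions; the paper may apply the split per session rather than per event):

```python
# Hedged sketch: hold out the most recent 10% of training data (by time)
# as the validation set, rather than sampling at random.
import pandas as pd

def time_based_split(train: pd.DataFrame, valid_fraction: float = 0.1):
    train = train.sort_values("timestamp")
    cutoff = train["timestamp"].quantile(1.0 - valid_fraction)
    train_part = train[train["timestamp"] <= cutoff]
    valid_part = train[train["timestamp"] > cutoff]
    return train_part, valid_part
```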
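
Finally, the hyperparameters quoted in the experiment-setup row, gathered per dataset as a configuration sketch; key names are illustrative, and values elided by the quote's ellipses are left out:

```python
# Hedged sketch: reported hyperparameters per dataset. Key names are
# assumptions; only values stated in the quoted setup are included.
EXPERIMENT_CONFIG = {
    "YOOCHOOSE": {
        "hidden_size": 100,          # GRU4Rec and NARM hidden layer size
        "narm_embedding_size": 50,   # NARM item embedding size
        "batch_size": 512,
        "epochs": 30,
        "neural_lr": 5e-4,
        "gating_lr": 1e-3,
        "num_neighbors": 50,
    },
    "DIGINETICA": {
        "hidden_size": 100,
        "narm_embedding_size": 50,
        "batch_size": 512,
        "epochs": 30,
        "neural_lr": 1e-4,
        "gating_lr": 1e-3,
        "num_neighbors": 100,
    },
}
```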