Learning to Recommend from Sparse Data via Generative User Feedback
Authors: Wenlin Wang (pp. 4436-4444)
AAAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results show that the proposed framework is able to enrich the learning of user preference and boost the performance of existing collaborative filtering methods on multiple datasets. |
| Researcher Affiliation | Academia | Wenlin Wang Department of Electrical and Computer Engineering, Duke University wlwang616@gmail.com |
| Pseudocode | No | The paper describes the learning algorithm in text and mathematical formulations but does not include explicit pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not include any statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | Yes | We investigate the effectiveness of the proposed CF-SFL framework on three benchmark datasets of recommendation systems. (i) MovieLens-20M (ML-20M)... (ii) Netflix-Prize (Netflix)... (Bennett, Lanning et al. 2007); (iii) Million Song Dataset (MSD)... (Bertin-Mahieux et al. 2011). |
| Dataset Splits | Yes | Figure 3: Performance (NDCG@100) boost on the validation sets. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models, processor types, or memory amounts used for running experiments. |
| Software Dependencies | No | The paper mentions the use of 'Adam optimizer' but does not specify any software names with version numbers (e.g., programming languages, libraries, or frameworks). |
| Experiment Setup | Yes | To learn the model, we pre-train the recommender (150 epochs for ML-20M and 75 epochs for Netflix and MSD) and optimize the entire framework (50 epochs for ML-20M and 25 epochs for the other two). ℓ2 regularization with a penalty term 0.01 is applied to the recommender, and the Adam optimizer (Kingma and Ba 2014) is employed with a batch size of 500. |
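The quoted setup can be sketched in plain Python. This is a minimal illustration, not the authors' code: the epoch counts, ℓ2 penalty (0.01), and batch size (500) come from the paper's setup description, while the Adam update itself follows the standard formulation of Kingma and Ba (2014); the parameter shapes and hyperparameter defaults are illustrative assumptions.

```python
# Training schedule and ℓ2 penalty as quoted from the paper's setup section.
PRETRAIN_EPOCHS = {"ML-20M": 150, "Netflix": 75, "MSD": 75}
JOINT_EPOCHS = {"ML-20M": 50, "Netflix": 25, "MSD": 25}
BATCH_SIZE = 500
L2_PENALTY = 0.01


def adam_step(w, grad, m, v, t,
              lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, l2=L2_PENALTY):
    """One standard Adam update on a list of scalar weights.

    The ℓ2 regularization is folded into the gradient, i.e.
    g <- grad + l2 * w, before the moment updates. Hyperparameter
    defaults (lr, b1, b2, eps) are Adam's usual values, assumed here.
    """
    g = [gi + l2 * wi for gi, wi in zip(grad, w)]            # ℓ2 penalty term
    m = [b1 * mi + (1 - b1) * gi for mi, gi in zip(m, g)]    # first moment
    v = [b2 * vi + (1 - b2) * gi * gi for vi, gi in zip(v, g)]  # second moment
    m_hat = [mi / (1 - b1 ** t) for mi in m]                 # bias correction
    v_hat = [vi / (1 - b2 ** t) for vi in v]
    w = [wi - lr * mh / (vh ** 0.5 + eps)
         for wi, mh, vh in zip(w, m_hat, v_hat)]
    return w, m, v


# Example: a single update step on one weight.
w, m, v = adam_step(w=[1.0], grad=[2.0], m=[0.0], v=[0.0], t=1)
```

With these defaults, the first Adam step moves each weight by approximately the learning rate in the descent direction, as expected from the bias-corrected update.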