Personalized Time-Aware Tag Recommendation
Authors: Keqiang Wang, Yuanyuan Jin, Haofen Wang, Hongwei Peng, Xiaoling Wang
AAAI 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The experimental results show that our proposed model outperforms the state of the art tag recommendation methods in accuracy and has better ability to recommend new tags. |
| Researcher Affiliation | Collaboration | (1) International Research Center of Trustworthy Software, Shanghai Key Laboratory of Trustworthy Computing, East China Normal University, Shanghai, China; (2) Shenzhen Gowild Robotics Co., Ltd. |
| Pseudocode | Yes | Algorithm 1: An Optimization Algorithm for TAPITF. |
| Open Source Code | No | The paper does not explicitly state that the source code for their methodology is released, nor does it provide a link to a code repository. |
| Open Datasets | Yes | We evaluate the models on the three public data sets Movielens, Last FM and Delicious described in table 1. |
| Dataset Splits | No | We use leave-one-out to split data set into train set and test set, which is that for each user, his tagging records on a certain item are randomly removed from the training set S_train and put into the test set S_test. |
| Hardware Specification | No | The paper discusses 'running time' (Table 3) but does not provide any specific details about the hardware used for the experiments, such as CPU/GPU models or memory specifications. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., Python 3.x, PyTorch 1.x). |
| Experiment Setup | Yes | Latent factor dimension K = 64, regularization factor λ = 0.00005, and learning rate 0.05. In TAPITF, d = 0.5 and the time unit is one day; the latent factor dimension and regularization factor are the same as in PITF. The number of iterations for both PITF and TAPITF is 100. |
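
The Dataset Splits row quotes a per-user leave-one-out protocol but reports no fixed split files or percentages. Below is a minimal sketch of how that protocol could be reproduced; the function name, record layout (user, item, tag, timestamp), and random seed are assumptions for illustration, not part of the paper.

```python
import random
from collections import defaultdict

def leave_one_out_split(records, seed=42):
    """Per-user leave-one-out split as quoted in the paper: for each user,
    the tagging records on one randomly chosen item are removed from the
    training set and placed in the test set.

    `records` is assumed to be a list of (user, item, tag, timestamp) tuples.
    """
    rng = random.Random(seed)
    by_user_item = defaultdict(lambda: defaultdict(list))
    for rec in records:
        user, item = rec[0], rec[1]
        by_user_item[user][item].append(rec)

    train, test = [], []
    for user, items in by_user_item.items():
        held_out_item = rng.choice(list(items.keys()))  # one item per user goes to test
        for item, recs in items.items():
            (test if item == held_out_item else train).extend(recs)
    return train, test
```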
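The Experiment Setup row does report concrete hyperparameters. The sketch below collects them as a configuration for a hypothetical reimplementation; since no code was released, the key names are illustrative, not the authors' settings file.

```python
# Hyperparameters reported in the Experiment Setup row; key names are
# illustrative assumptions, not taken from any released code.
PITF_CONFIG = {
    "latent_dim": 64,        # latent factor dimension K
    "reg_lambda": 0.00005,   # regularization factor λ
    "learning_rate": 0.05,
    "num_iterations": 100,
}

TAPITF_CONFIG = {
    **PITF_CONFIG,           # same K, λ, learning rate, and iteration count as PITF
    "decay_d": 0.5,          # temporal parameter d reported for TAPITF
    "time_unit": "day",      # granularity of the time-aware component
}
```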