Win-Win: A Privacy-Preserving Federated Framework for Dual-Target Cross-Domain Recommendation

Authors: Gaode Chen, Xinghua Zhang, Yijun Su, Yantong Lai, Ji Xiang, Junbo Zhang, Yu Zheng

AAAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments on three real-world datasets demonstrate that P2FCDR significantly outperforms the state-of-the-art methods and effectively protects data privacy.
Researcher Affiliation | Collaboration | 1) Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China; 2) School of Cyber Security, University of Chinese Academy of Sciences, Beijing, China; 3) JD iCity, JD Technology, Beijing, China; 4) JD Intelligent Cities Research, Beijing, China
Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks.
Open Source Code | No | The paper does not provide any links or explicit statements about the availability of open-source code for the described methodology.
Open Datasets | Yes | We study the effectiveness of our P2FCDR on the three largest domains of the real-world public Amazon dataset (http://jmcauley.ucsd.edu/data/amazon/), i.e., Movies and TV (Movie), Books (Book), and CDs and Vinyl (Music).
Dataset Splits | No | Specifically, we held out the latest interaction as the test set and utilized the remaining data for training. The paper does not explicitly mention a separate validation split or how it was derived.
Hardware Specification | No | The paper does not specify any hardware details such as GPU/CPU models, processors, or memory used for the experiments.
Software Dependencies | No | The paper mentions using Adam as the optimizer but does not provide version numbers for any software dependencies or libraries such as Python, PyTorch, or TensorFlow.
Experiment Setup | Yes | For the representation modeling of users and items, we use a two-layer fully connected network with dimensions 128 and 128, respectively, and obtain a final embedding dimension k of 128. Considering the trade-off between recommendation performance and privacy protection, we set λ to 0.02. For learning the gated selecting vector, we use a two-layer fully connected network with dimensions 128 and 128, respectively. When training our models, we choose Adam as the optimizer and set the learning rate to 0.001. Meanwhile, we select batches of users according to the IDs of the common users to construct mini-batches, with a batch size of 256.
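The leave-latest-out split described under "Dataset Splits" can be sketched in a few lines. This is a hedged illustration of the stated procedure (hold out each user's most recent interaction as the test example), not code from the paper; the function name and input format are assumptions, and, as the assessment notes, no validation split is created because the paper does not describe one.

```python
from collections import defaultdict

def leave_latest_out(interactions):
    """Split (user_id, item_id, timestamp) tuples: per user, the latest
    interaction goes to the test set, the rest to the training set."""
    by_user = defaultdict(list)
    for user, item, ts in interactions:
        by_user[user].append((ts, item))
    train, test = [], []
    for user, events in by_user.items():
        events.sort()                        # chronological order
        *history, latest = events
        train += [(user, item) for _, item in history]
        test.append((user, latest[1]))       # most recent interaction
    return train, test
```

Users with a single interaction end up only in the test set under this reading; the paper does not specify how such users are handled.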
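The reported hyperparameters under "Experiment Setup" can be gathered into a minimal sketch. Only the numbers below (layer widths 128/128, embedding dimension k = 128, λ = 0.02, Adam learning rate 0.001, batch size 256) come from the paper; everything else, including the input width, ReLU activation, sigmoid gate, and weight initialisation, is an assumption for illustration.

```python
import numpy as np

K = 128            # final embedding dimension (from the paper)
LAMBDA = 0.02      # performance/privacy trade-off weight (from the paper)
LR = 0.001         # Adam learning rate (from the paper)
BATCH_SIZE = 256   # mini-batches built over common-user IDs (from the paper)

rng = np.random.default_rng(0)

def two_layer_fc(dim_in, dims=(128, 128)):
    """Initialise a two-layer fully connected network (assumed init scheme)."""
    w1 = rng.standard_normal((dim_in, dims[0])) * 0.01
    w2 = rng.standard_normal((dims[0], dims[1])) * 0.01
    return w1, w2

def forward(x, params, gate=False):
    w1, w2 = params
    h = np.maximum(x @ w1, 0.0)                           # ReLU (assumed)
    out = h @ w2
    return 1.0 / (1.0 + np.exp(-out)) if gate else out    # sigmoid for gate

encoder = two_layer_fc(dim_in=K)    # user/item representation network
gate_net = two_layer_fc(dim_in=K)   # gated selecting vector network

x = rng.standard_normal((BATCH_SIZE, K))
emb = forward(x, encoder)           # (256, 128) embeddings
g = forward(x, gate_net, gate=True) # gate values in (0, 1)
fused = g * emb                     # element-wise gated selection (assumed use)
```

The sketch omits the federated training loop and the privacy mechanism weighted by λ, which the quoted setup text does not specify in enough detail to reproduce.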