Zero-Shot Rumor Detection with Propagation Structure via Prompt Learning

Authors: Hongzhan Lin, Pengyao Yi, Jing Ma, Haiyun Jiang, Ziyang Luo, Shuming Shi, Ruifang Liu

AAAI 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments conducted on three real-world datasets demonstrate that our proposed model achieves much better performance than state-of-the-art methods and exhibits a superior capacity for detecting rumors at early stages.
Researcher Affiliation | Academia | Hong Kong Baptist University, Beijing University of Posts and Telecommunications, Fudan University, Tsinghua University
Pseudocode | No | The paper describes the approach in detail but does not provide any pseudocode or algorithm blocks.
Open Source Code | Yes | Our code and resources will be available at https://github.com/PengyaoYi/zeroRumorAAAI
Open Datasets | Yes | We utilize four public datasets TWITTER, WEIBO (Ma et al. 2016), Twitter-COVID19 and Weibo-COVID19 (Lin et al. 2022) for experiments.
Dataset Splits | No | The paper mentions 'Early stopping (Yao, Rosasco, and Caponnetto 2007) is applied to avoid overfitting', which implies the use of a validation set. However, it does not provide specific details on the split percentages, sample counts, or methodology for the validation set.
Hardware Specification | No | The paper does not specify the hardware used for running experiments (e.g., specific GPU or CPU models, memory, or cloud computing instances with detailed specifications).
Software Dependencies | No | The paper mentions the use of 'multilingual PLMs' and the 'AdamW optimizer' but does not provide specific version numbers for these or other software dependencies (e.g., Python, PyTorch, TensorFlow versions).
Experiment Setup | Yes | We set the layer number k of the SynEncoder as 6. The learning rate is initialized as 1e-5. Early stopping (Yao, Rosasco, and Caponnetto 2007) is applied to avoid overfitting.
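The early-stopping criterion cited in the setup row can be sketched in isolation. This is a minimal illustration, not the authors' code: the validation-loss sequence and the patience value are hypothetical, since the paper reports neither a validation split nor an early-stopping patience.

```python
# Minimal sketch of patience-based early stopping, as mentioned in the
# experiment setup (Yao, Rosasco, and Caponnetto 2007). The patience
# value and loss values below are assumptions for illustration only.

def train_with_early_stopping(val_losses, patience=3):
    """Return the epoch index at which training would stop."""
    best = float("inf")
    bad_epochs = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:          # validation loss improved: reset counter
            best = loss
            bad_epochs = 0
        else:                    # no improvement this epoch
            bad_epochs += 1
            if bad_epochs >= patience:
                return epoch     # stop to avoid overfitting
    return len(val_losses) - 1   # ran through all epochs

# Example: loss bottoms out at epoch 2, so with patience=3 training
# stops at epoch 5.
stop_epoch = train_with_early_stopping([0.9, 0.7, 0.6, 0.61, 0.65, 0.62])
```

In a full training loop the placeholder loss list would be replaced by a per-epoch validation pass, with model parameters updated by AdamW at the paper's stated learning rate of 1e-5.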