Dynamicity-aware Social Bot Detection with Dynamic Graph Transformers
Authors: Buyun He, Yingguang Yang, Qi Wu, Hao Liu, Renyu Yang, Hao Peng, Xiang Wang, Yong Liao, Pengyuan Zhou
IJCAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results demonstrate the superiority of BotDGT against leading methods that neglect the dynamic nature of social networks in terms of accuracy, recall, and F1-score. |
| Researcher Affiliation | Academia | 1. University of Science and Technology of China; 2. Beihang University; 3. Harbin Engineering University; 4. Aarhus University |
| Pseudocode | No | The paper describes its methodology in prose and mathematical equations but does not include pseudocode or an algorithm block. |
| Open Source Code | Yes | Our code is publicly available on GitHub: https://github.com/Peien429/BotDGT |
| Open Datasets | Yes | We conduct experiments on two comprehensive social bot detection benchmarks: TwiBot-20 [Feng et al., 2021a] and TwiBot-22 [Feng et al., 2022b]. |
| Dataset Splits | No | The paper mentions using the TwiBot-20 and TwiBot-22 datasets but does not explicitly detail the training, validation, and test splits (e.g., percentages or sample counts). |
| Hardware Specification | No | The paper does not provide any specific hardware details such as GPU/CPU models, memory, or cloud instance types used for running the experiments. |
| Software Dependencies | No | The paper does not specify the version numbers of any software dependencies or libraries (e.g., Python, PyTorch, TensorFlow, or specific library versions) used for implementation or experimentation. |
| Experiment Setup | No | The paper describes the model architecture and general experimental setup but does not provide specific hyperparameter values (e.g., learning rate, batch size, number of epochs) or detailed training configurations. |