Adversarial Socialbots Modeling Based on Structural Information Principles
Authors: Xianghua Zeng, Hao Peng, Angsheng Li
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive and comparative experiments on both homogeneous and heterogeneous social networks demonstrate that, compared with state-of-the-art baselines, the proposed SIASM framework yields substantial performance improvements in terms of network influence (up to 16.32%) and sustainable stealthiness (up to 16.29%) when evaluated against a robust detector with 90% accuracy. |
| Researcher Affiliation | Academia | 1) State Key Laboratory of Software Development Environment, Beihang University, Beijing, China; 2) Zhongguancun Laboratory, Beijing, China. Emails: {zengxianghua, penghao, angsheng}@buaa.edu.cn, liangsheng@gmail.zgclab.edu.cn |
| Pseudocode | Yes | Algorithm 1: The Optimization Algorithm |
| Open Source Code | Yes | Furthermore, all source codes and experimental results are available at an anonymous link: https://github.com/SELGroup/SIASM |
| Open Datasets | Yes | For heterogeneous network analysis, we use the latest Higgs Twitter Dataset (De Domenico et al. 2013), which includes directed multi-relational interactions. Like other works (Le, Tran-Thanh, and Lee 2022), we select 10% of the real-life networks to construct synthetic stochastic networks as the training set and take the remaining 90% of the collected networks as the testing set. |
| Dataset Splits | No | The paper states 'we select 10% of the real-life networks to construct synthetic stochastic networks as the training set and take the remaining 90% of the collected networks as the testing set.' It does not explicitly mention a separate validation set. |
| Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., Python 3.x, PyTorch 1.x, CUDA x.x). |
| Experiment Setup | Yes | During the training process on synthetic graphs, we use a default propagation probability (p) of 0.8 and a maximal episode length (Tmax) of 120. In this work, we set the ratio of filtered user vertices and the height of pruned subtrees as 5% and 1, respectively. (These reported settings, together with the 10%/90% split quoted above, are collected in the sketch after this table.) |
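
The numeric settings reported in the table can be gathered into a single configuration for reproduction attempts. The sketch below is a minimal, hypothetical Python illustration: `SIASMConfig`, `split_networks`, the field names, and the random seed are assumptions introduced here and do not appear in the paper; only the numeric values (p = 0.8, Tmax = 120, 5% filtered vertices, pruned subtree height 1, 10%/90% train/test split, 90% detector accuracy) come from the quoted text.

```python
from dataclasses import dataclass
import random


@dataclass
class SIASMConfig:
    """Hypothetical container for the hyperparameters reported in the paper."""
    propagation_probability: float = 0.8   # default propagation probability p
    max_episode_length: int = 120          # maximal episode length Tmax
    filtered_vertex_ratio: float = 0.05    # ratio of filtered user vertices (5%)
    pruned_subtree_height: int = 1         # height of pruned subtrees
    train_fraction: float = 0.10           # 10% of networks used for synthetic training graphs
    detector_accuracy: float = 0.90        # accuracy of the robust detector used for evaluation


def split_networks(network_ids, cfg: SIASMConfig, seed: int = 0):
    """Illustrative 10%/90% train/test split over collected networks.

    The paper reports no separate validation set, so none is created here.
    """
    rng = random.Random(seed)
    ids = list(network_ids)
    rng.shuffle(ids)
    n_train = max(1, int(len(ids) * cfg.train_fraction))
    return ids[:n_train], ids[n_train:]


if __name__ == "__main__":
    cfg = SIASMConfig()
    train_ids, test_ids = split_networks(range(100), cfg)
    print(cfg)
    print(f"train networks: {len(train_ids)}, test networks: {len(test_ids)}")
```

Collecting the settings in one place makes it easier to check a reproduction against the reported protocol; anything not listed in the quoted text (hardware, software versions, validation split) remains unspecified by the paper.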