Network Diffusions via Neural Mean-Field Dynamics

Authors: Shushan He, Hongyuan Zha, Xiaojing Ye

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirical study shows that our approach is versatile and robust to variations of the underlying diffusion network models, and significantly outperforms existing approaches in accuracy and efficiency on both synthetic and real-world data."
Researcher Affiliation | Academia | Shushan He, Mathematics & Statistics, Georgia State University, Atlanta, Georgia, USA (she4@gsu.edu); Hongyuan Zha, School of Data Science, Shenzhen Research Institute of Big Data, CUHK, Shenzhen, China (zhahy@cuhk.edu.cn); Xiaojing Ye, Mathematics & Statistics, Georgia State University, Atlanta, Georgia, USA (xye@gsu.edu)
Pseudocode | Yes | Algorithm 1: "Neural mean-field (NMF) algorithm for network inference and influence estimation"
Open Source Code | Yes | "Our numerical implementation of NMF is available at https://github.com/ShushanHe/neural-mf."
Open Datasets | Yes | "We generate training data consisting of K = 10,000 cascades, formed by 10 sample cascades for each of 1,000 source sets (a source set is generated by randomly selecting 1 to 10 nodes from the network). All networks and cascades are generated by SNAP [29]. Our numerical implementation of NMF is available at https://github.com/ShushanHe/neural-mf. ... We also tested NMF on a real dataset [54] from the Sina Weibo social platform..." (A hedged sketch of this source-set and cascade generation appears after the table.)
Dataset Splits | No | No dataset split information (exact percentages, sample counts, citations to predefined splits, or a detailed train/validation/test splitting methodology) is provided for reproduction purposes. Training and testing data are mentioned, but no dedicated validation set or precise split ratios are specified.
Hardware Specification | No | No specific hardware details (exact GPU/CPU models, processor types, memory amounts, or detailed computer specifications) used for running the experiments are provided.
Software Dependencies | No | The paper does not provide specific software dependencies with version numbers, such as library or solver names with their corresponding versions (e.g., Python 3.8, TensorFlow 2.x).
Experiment Setup | Yes | "For Influ Learner, we set 128 as the feature number for optimal accuracy as suggested in [12]. For LSTM, we use one LSTM block and a dense layer for each t. ... For each distribution, we draw α_ji from Unif[0.1, 1] to simulate the varying interactions between nodes." (A hedged sketch of this LSTM baseline appears after the table.)
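
The data-generation procedure quoted in the Open Datasets row is described only in prose. Below is a minimal sketch of how 1,000 source sets (1 to 10 random seed nodes each) with 10 cascades per source set could be simulated, with edge rates α_ji drawn from Unif[0.1, 1] as in the Experiment Setup row. This is an illustration under stated assumptions, not the paper's pipeline: the paper generates its networks and cascades with SNAP, and the sparse random graph, the exponential transmission-time model, and the function name `simulate_cascade` are all hypothetical choices here.

```python
import heapq
import random
import numpy as np

def simulate_cascade(alpha, sources, t_max=10.0):
    """Hypothetical continuous-time cascade over rates alpha[j, i] > 0:
    each explored edge (j -> i) draws an exponential transmission delay,
    and a node's infection time is its earliest arrival from the source
    set (Dijkstra-style relaxation over the sampled delays)."""
    n = alpha.shape[0]
    inf_time = np.full(n, np.inf)
    heap = []
    for s in sources:
        inf_time[s] = 0.0
        heapq.heappush(heap, (0.0, s))
    while heap:
        t, j = heapq.heappop(heap)
        if t > inf_time[j]:
            continue  # stale heap entry
        for i in np.nonzero(alpha[j])[0]:
            delay = np.random.exponential(1.0 / alpha[j, i])
            if t + delay < min(inf_time[i], t_max):
                inf_time[i] = t + delay
                heapq.heappush(heap, (inf_time[i], i))
    return inf_time  # np.inf marks nodes not infected before t_max

np.random.seed(0)
n_nodes = 128
# Sparse random rate matrix: alpha_ji ~ Unif[0.1, 1] on ~5% of node pairs
# (an assumed stand-in for the SNAP-generated networks used in the paper).
mask = np.random.rand(n_nodes, n_nodes) < 0.05
np.fill_diagonal(mask, False)
alpha = np.where(mask, np.random.uniform(0.1, 1.0, (n_nodes, n_nodes)), 0.0)

cascades = []
for _ in range(1000):                       # 1,000 source sets ...
    srcs = random.sample(range(n_nodes), random.randint(1, 10))
    for _ in range(10):                     # ... x 10 cascades = K = 10,000
        cascades.append((srcs, simulate_cascade(alpha, srcs)))
```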
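
The LSTM baseline in the Experiment Setup row is specified only as "one LSTM block and a dense layer for each t". A minimal PyTorch reading of that sentence is sketched below; the hidden size, the number of time steps, the per-node input/output dimensions, and the choice of a separate Linear head per step (rather than one shared head) are all assumptions, since the quoted text does not pin them down.

```python
import torch
import torch.nn as nn

class LSTMBaseline(nn.Module):
    """Hypothetical reading of the quoted baseline: one LSTM block,
    followed by a separate dense (Linear) layer for each time step t.
    n_nodes, hidden_size, and n_steps are illustrative choices."""
    def __init__(self, n_nodes=128, hidden_size=64, n_steps=20):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_nodes, hidden_size=hidden_size,
                            batch_first=True)
        # One dense layer per time step t, mapping the LSTM state at t
        # to per-node infection probabilities at t.
        self.heads = nn.ModuleList(
            nn.Linear(hidden_size, n_nodes) for _ in range(n_steps))

    def forward(self, x):
        # x: (batch, n_steps, n_nodes) indicator input per time step
        h, _ = self.lstm(x)                   # (batch, n_steps, hidden)
        out = [torch.sigmoid(head(h[:, t]))   # per-step dense layer
               for t, head in enumerate(self.heads)]
        return torch.stack(out, dim=1)        # (batch, n_steps, n_nodes)

# Usage sketch: predict infection probabilities over 20 time steps.
model = LSTMBaseline()
probs = model(torch.zeros(4, 20, 128))        # (4, 20, 128)
```

A single shared head (one `nn.Linear` applied to every step) would be the more common design; the per-step heads above simply follow the quoted wording literally.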