Group Sparse Bayesian Learning for Active Surveillance on Epidemic Dynamics

Authors: Hongbin Pei, Bo Yang, Jiming Liu, Lei Dong

AAAI 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | The efficacy of the proposed algorithm is theoretically analyzed and empirically validated using both synthetic and real-world data. Validations and comparisons were performed on both synthetic and real-world data, which show that the proposed method outperforms existing methods.
Researcher Affiliation | Academia | Hongbin Pei (1,2), Bo Yang (1,2), Jiming Liu (3), Lei Dong (4): (1) College of Computer Science and Technology, Jilin University, Changchun, China; (2) Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, China; (3) Department of Computer Science, Hong Kong Baptist University, Hong Kong; (4) Institute of Remote Sensing and Geographical Information Systems, Peking University, Beijing, China
Pseudocode | Yes | Algorithm 1: SNMA
Open Source Code | No | The paper does not provide any explicit statements about making its source code available, nor does it include a link to a code repository.
Open Datasets | Yes | The case report data of the 2009 Hong Kong H1N1 influenza epidemic were provided by the Centre for Health Protection (CHP), Department of Health, Government of the Hong Kong Special Administrative Region. Five years (2005-2009) of monthly malaria case data at the town level were collected by the Tengchong CDC and can be obtained from the annual reports of the National Institute of Parasitic Disease, China CDC. Baidu Tieba (tieba.baidu.com), one of the largest online community platforms in China, is a collection of thousands of active topic-specific communities.
Dataset Splits | Yes | We adopt a 5-fold cross-validation strategy in experiments on synthetic data. We set the dynamics from Jun. 1 to Aug. 15 as training data and the one from Aug. 15 to Sep. 15 as test data. We set the malaria dynamics from 2005 to 2008 as training data and the one during 2009 as test data. Here, we split the training and test data in terms of the hot words, i.e., we alternately set the dynamics of one hot word as test data and the rest as the training data. (An illustrative split sketch follows the table.)
Hardware Specification | Yes | Fig. 4 presents the average running time of one iteration of the three methods on a PC with a 3.4GHz CPU and 8GB memory.
Software Dependencies | No | The paper does not provide specific version numbers for any software components or libraries used in the experiments.
Experiment Setup | No | The paper describes general environment settings like data volume and noise levels for synthetic data, and mentions random initialization of some parameters in Algorithm 1. However, it does not provide specific hyperparameter values (e.g., learning rates, batch sizes, exact initialization values) or detailed system-level training configurations to reproduce the experiments.
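
The split strategy quoted in the Dataset Splits row can be made concrete with a short sketch. The paper releases no code, so the snippet below is only an illustration under assumed names (synthetic_X, epidemic_df) and placeholder data; it mirrors the described 5-fold cross-validation on synthetic data and the date-based H1N1 split (Jun. 1 to Aug. 15 for training, Aug. 15 to Sep. 15 for testing), with the 2005-2008 versus 2009 malaria split handled analogously.

    import numpy as np
    import pandas as pd
    from sklearn.model_selection import KFold

    # Synthetic data: 5-fold cross-validation, as stated in the paper.
    # synthetic_X is a placeholder feature matrix; the real inputs are the
    # simulated epidemic dynamics described in the paper.
    synthetic_X = np.random.rand(100, 10)
    kf = KFold(n_splits=5, shuffle=True, random_state=0)
    for train_idx, test_idx in kf.split(synthetic_X):
        train_fold, test_fold = synthetic_X[train_idx], synthetic_X[test_idx]
        # fit the surveillance model on train_fold, evaluate on test_fold

    # H1N1 case reports: temporal split, training Jun. 1 to Aug. 15 and
    # testing Aug. 15 to Sep. 15 (placeholder counts; real data from HK CHP).
    epidemic_df = pd.DataFrame({
        "date": pd.date_range("2009-06-01", "2009-09-15", freq="D"),
        "cases": np.random.poisson(5, size=107),
    })
    train_h1n1 = epidemic_df[epidemic_df["date"] < "2009-08-15"]
    test_h1n1 = epidemic_df[epidemic_df["date"] >= "2009-08-15"]

    # Malaria data: same idea with a yearly cutoff, 2005-2008 for training
    # and 2009 for testing.

For the Baidu Tieba experiment, the quoted description amounts to a leave-one-out split over hot words; one way to express this would be scikit-learn's LeaveOneGroupOut with the hot word as the group label, though the paper itself does not specify an implementation.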