Online Learning from Capricious Data Streams: A Generative Approach

Authors: Yi He, Baijun Wu, Di Wu, Ege Beyazit, Sheng Chen, Xindong Wu

IJCAI 2019

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "The experimental results demonstrate that OCDS achieves conspicuous performance on both synthetic and real datasets." and, from Section 6 (Experiments), "We use 15 UCI datasets [Dua and Karra Taniskidou, 2017] and 1 real-world IMDB dataset [Maas et al., 2011] to evaluate the performance of OCDS." |
| Researcher Affiliation | Academia | "1 School of Computing and Informatics, University of Louisiana at Lafayette, USA; 2 Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences, China" |
| Pseudocode | Yes | "Algorithm 1: Retrieval Strategy" and "Algorithm 2: The OCDS algorithm" |
| Open Source Code | No | The paper does not include an explicit statement about releasing source code for the described methodology, nor a link to a code repository. |
| Open Datasets | Yes | "We use 15 UCI datasets [Dua and Karra Taniskidou, 2017] and 1 real-world IMDB dataset [Maas et al., 2011] to evaluate the performance of OCDS." |
| Dataset Splits | No | The paper does not provide explicit details about training, validation, or test dataset splits (e.g., percentages, sample counts, or specific pre-defined splits). |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used to run the experiments. |
| Software Dependencies | No | The paper does not list specific software dependencies or library versions used for the experiments. |
| Experiment Setup | Yes | "To find the best settings of the parameters α, β1 and β2, we use grid searches ranging from 10^-5 to 1. For memory and running time efficiency, we let \|Ut\| ≤ 150 by setting γ in different datasets. ... The ratio of the maximal removed features is denoted as VI. For example, VI = 0.5 means that at most 50% of features in xt are randomly removed. The default value of VI is 0.5 in our experiments." |
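For readers who want to approximate the reported setup, the sketch below illustrates the two knobs quoted above: a log-spaced grid search over α, β1, β2 between 10^-5 and 1, and a VI-style simulation of a capricious stream in which up to 50% of each instance's features are randomly removed. This is a minimal illustration under stated assumptions, not the authors' released code; the function name `simulate_capricious_stream` and the NaN convention for unobserved features are assumptions of this sketch.

```python
import itertools
import numpy as np

def simulate_capricious_stream(X, vi=0.5, seed=0):
    """Hide up to a `vi` fraction of features per arriving instance.

    VI = 0.5 means at most 50% of the features in x_t are randomly removed.
    Marking unobserved features with NaN is an assumption of this sketch.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    masked = X.astype(float).copy()
    for t in range(n):
        k = rng.integers(0, int(vi * d) + 1)         # how many features vanish at time t
        drop = rng.choice(d, size=k, replace=False)  # which features vanish
        masked[t, drop] = np.nan
    return masked

# Log-spaced grid from 1e-5 to 1 for alpha, beta1, beta2, as in the reported grid search.
param_grid = np.logspace(-5, 0, num=6)
stream = simulate_capricious_stream(np.random.rand(100, 20), vi=0.5)
for alpha, beta1, beta2 in itertools.product(param_grid, repeat=3):
    pass  # run the learner on `stream` with (alpha, beta1, beta2) and keep the best setting
```

The grid size (six log-spaced points) is a placeholder; the paper only states the range of the search, not its resolution.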