Active Lifelong Learning With “Watchdog”

Authors: Gan Sun, Yang Cong, Xiaowei Xu

AAAI 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Extensive experiments on both benchmark datasets and our own dataset demonstrate the effectiveness of our proposed model especially in task selection and dictionary learning."
Researcher Affiliation | Academia | "(1) State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, China. (2) University of Chinese Academy of Sciences, China. (3) Department of Information Science, University of Arkansas at Little Rock, USA."
Pseudocode | Yes | Algorithm 1 (Active Lifelong Learning, AcLL) and Algorithm 2 (Online Dictionary Update); a generic sketch of an online dictionary update step follows the table.
Open Source Code | No | The paper provides no explicit statement about, or link to, open-source code for the described methodology.
Open Datasets | Yes | "We use six benchmark datasets for our experiments: London School Data (School)... Parkinson Data... Yeast Data... Smart Meter Data..." with sources at http://cvn.ecp.fr/personnel/andreas/code/mtl/index.html, https://archive.ics.uci.edu/ml/datasets/parkinsons+telemonitoring, http://mulan.sourceforge.net/datasets-mlc.html, and http://www.ucd.ie/issda/data/commissionforenergyregulationcer/ respectively (a loading sketch follows the table).
Dataset Splits | No | The paper states "For each task, we randomly split 50%-50% train-test set for our experiments" but mentions no separate validation split and gives no splitting detail beyond the 50-50 train-test split (see the split sketch after the table).
Hardware Specification | Yes | "All the experiments are performed using Matlab on the computer with 12G RAM, i7 CPU."
Software Dependencies | No | The paper states "All the experiments are performed using Matlab" but specifies neither the Matlab version nor any other software dependencies with version numbers.
Experiment Setup | Yes | Algorithm 1's input is "λ1 > 0, λ2 > 0, μ1 > 0, μ2 > 0, t = 0", and the paper adds: "Additionally, we also evaluate the effect of local dictionary ΘA by fixing λ1 = 1 and adjusting λ2 in [0.001, 0.01, 0.1, 1, 10, 100, 1000]" (see the parameter-sweep sketch after the table).
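
The paper's pseudocode is not reproduced in this report, so the following is only a minimal sketch of what a generic online dictionary update step looks like, using block coordinate descent over dictionary columns in the style of Mairal et al. The function name, the sufficient-statistics matrices `A` and `B`, and the unit-ball projection are all assumptions for illustration; the paper's Algorithm 2 may differ.

```python
import numpy as np

def online_dictionary_update(D, A, B, eps=1e-8):
    """One block-coordinate-descent pass over dictionary columns.

    D: (d, k) current dictionary.
    A: (k, k) accumulated sparse-code statistics (sum of s s^T).
    B: (d, k) accumulated data statistics (sum of x s^T).
    Generic Mairal-style update, not the paper's exact Algorithm 2.
    """
    _, k = D.shape
    for j in range(k):
        # Closed-form coordinate update of column j, then projection
        # onto the unit ball to keep dictionary atoms bounded.
        u = (B[:, j] - D @ A[:, j]) / (A[j, j] + eps) + D[:, j]
        D[:, j] = u / max(np.linalg.norm(u), 1.0)
    return D
```

Updating one column at a time keeps each step closed-form, which is what makes a dictionary update cheap enough to run online as new tasks arrive.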
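As a concrete example of using one of the listed datasets, here is a minimal loading sketch for the UCI Parkinson's Telemonitoring data. The filename `parkinsons_updrs.data` and the `subject#` column follow the customary UCI conventions and are assumptions here; the report itself only gives the dataset landing page.

```python
import pandas as pd

# Assumes the UCI Parkinson's Telemonitoring CSV was downloaded locally;
# "parkinsons_updrs.data" is the customary UCI filename (an assumption,
# not something stated in the report).
df = pd.read_csv("parkinsons_updrs.data")

# In a multi-task setup, each patient is commonly treated as one task.
tasks = {sid: grp.drop(columns="subject#")
         for sid, grp in df.groupby("subject#")}
print(f"loaded {len(tasks)} patient tasks")
```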
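The only split the paper reports is a per-task 50%-50% random train-test split. A minimal sketch of that split follows; the random seed is an added assumption, since the paper does not mention one.

```python
import numpy as np

def split_task(X, y, seed=0):
    """Randomly split one task's examples 50%-50% into train and test."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    half = len(y) // 2
    train, test = idx[:half], idx[half:]
    return X[train], y[train], X[test], y[test]
```

Note there is no validation portion, matching the report's observation that the paper describes no validation split.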
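Finally, the quoted setup fixes λ1 = 1 and sweeps λ2 over seven values. A sketch of that sweep is below; `train_acll`, `evaluate`, and the per-task data arguments are hypothetical placeholders for the paper's training and evaluation routines, and selecting the lowest score is an assumption.

```python
# Hypothetical stand-ins for the paper's training and evaluation routines;
# only the parameter grid itself comes from the paper.
def train_acll(tasks, lambda1, lambda2):
    raise NotImplementedError("placeholder for the paper's AcLL training")

def evaluate(model, tasks):
    raise NotImplementedError("placeholder for test-set evaluation")

lambda1 = 1.0                                        # fixed, as in the paper
lambda2_grid = [0.001, 0.01, 0.1, 1, 10, 100, 1000]  # swept, as quoted

def sweep(train_tasks, test_tasks):
    """Train and score one model per lambda2 value on the fixed grid."""
    results = {}
    for lam2 in lambda2_grid:
        model = train_acll(train_tasks, lambda1=lambda1, lambda2=lam2)
        results[lam2] = evaluate(model, test_tasks)
    return min(results, key=results.get)             # assumes lower is better
```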