Improved Schemes for Episodic Memory-based Lifelong Learning

Authors: Yunhui Guo, Mingrui Liu, Tianbao Yang, Tajana Rosing

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experimental results show that the proposed schemes significantly advance the state-of-the-art on four commonly used lifelong learning benchmarks, reducing the error by up to 18%. We conduct experiments on commonly used lifelong learning benchmarks: Permuted MNIST [4], Split CIFAR [21], Split CUB [2], Split AWA [2].
Researcher Affiliation | Academia | University of California, San Diego, CA; University of Iowa, Iowa City, IA
Pseudocode | Yes | Algorithm 1: The proposed improved schemes for episodic memory-based lifelong learning.
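The paper's Algorithm 1 builds on a per-task episodic memory that stores a small number of examples from each past task and replays them during training. The class below is a minimal, hypothetical sketch of such a buffer (names like `EpisodicMemory` and `size_per_task` are illustrative, not from the paper, and this is generic GEM-style replay rather than the authors' exact update rule):

```python
import random

class EpisodicMemory:
    """Hypothetical per-task memory buffer for episodic-memory-based
    lifelong learning: keeps at most `size_per_task` examples per task."""

    def __init__(self, size_per_task):
        self.size = size_per_task
        self.buffers = {}  # task_id -> list of stored (input, label) pairs

    def add(self, task_id, example):
        """Store an example; once a task's buffer is full, overwrite a random slot."""
        buf = self.buffers.setdefault(task_id, [])
        if len(buf) < self.size:
            buf.append(example)
        else:
            buf[random.randrange(self.size)] = example

    def sample(self, batch_size):
        """Draw a replay batch uniformly from all stored examples."""
        pool = [ex for buf in self.buffers.values() for ex in buf]
        return random.sample(pool, min(batch_size, len(pool)))
```

In a training loop, gradients computed on `sample(batch_size)` would be combined with the current task's gradients; how the two are balanced is exactly what the paper's improved schemes (MEGA-I/MEGA-II) address.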
Open Source Code | Yes | Implementation is available at: https://github.com/yunhuiguo/MEGA
Open Datasets | Yes | We conduct experiments on commonly used lifelong learning benchmarks: Permuted MNIST [4], Split CIFAR [21], Split CUB [2], Split AWA [2].
Dataset Splits | Yes | For each dataset, 17 tasks are used for training and 3 tasks are used for hyperparameter search. In MEGA-I, the hyperparameter is chosen from a grid spanning 10^-5 to 1 via the 3 validation tasks.
Hardware Specification | Yes | All the experiments are done on 8 NVIDIA TITAN RTX GPUs.
Software Dependencies | No | The paper does not provide specific version numbers for software components (e.g., programming languages, libraries, or frameworks) used in the experiments.
Experiment Setup | Yes | The episodic memory size for each task is 250, 65, 50, and 100, and the batch size for computing the gradients on the episodic memory (if needed) is 256, 256, 128, and 128 for MNIST, CIFAR, CUB, and AWA, respectively. In MEGA-I, the hyperparameter is chosen from a grid spanning 10^-5 to 1 via the 3 validation tasks. For Permuted MNIST we adopt a standard fully-connected network with two hidden layers; each layer has 256 units with ReLU activation. For Split CIFAR we use a reduced ResNet18. For Split CUB and Split AWA, we use a standard ResNet18 [47].