Kernel Continual Learning
Authors: Mohammad Mahdi Derakhshani, Xiantong Zhen, Ling Shao, Cees Snoek
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive evaluation on four benchmarks demonstrates the effectiveness and promise of kernels for continual learning. |
| Researcher Affiliation | Academia | 1AIM Lab, University of Amsterdam, The Netherlands 2Inception Institute of Artificial Intelligence, UAE. Correspondence to: M. Derakhshani <m.m.derakhshani@uva.nl>, X. Zhen <x.zhen@uva.nl>. |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | All our code will be released.1 [footnote 1: https://github.com/mmderakhshani/KCL] |
| Open Datasets | Yes | We perform experiments on four benchmark datasets: Rotated MNIST, Permuted MNIST, Split CIFAR100 and miniImageNet. |
| Dataset Splits | No | The paper does not explicitly provide specific training/validation/test split percentages or counts for the datasets used. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running the experiments. |
| Software Dependencies | No | The model is implemented in PyTorch (Paszke et al., 2019). (PyTorch is mentioned but without a specific version number, and no other versioned dependencies are listed.) |
| Experiment Setup | Yes | For the Permuted MNIST and Rotated MNIST benchmarks, hθ contains only two hidden layers, each of which has 256 neurons, followed by a ReLU activation function... With regard to the fγ and fφ networks, we adopt three hidden layers followed by an ELU activation function... On Permuted MNIST and Rotated MNIST, there are 256 neurons per layer, and we use 160 and 512 for Split CIFAR100 and miniImageNet, respectively. For fair comparisons, the model is trained for only one epoch per task, that is, each sample in the dataset is observed only once. The batch size is set to 10. |
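The Experiment Setup row pins down layer counts, widths, and activations but not a full implementation. As a minimal PyTorch sketch of those dimensions only: input and output sizes (784 for flattened MNIST digits, and the 256-dimensional feature/output widths below) are assumptions for illustration, not values stated in the table.

```python
import torch
import torch.nn as nn

def mlp(in_dim: int, hidden: int, depth: int, out_dim: int, act) -> nn.Sequential:
    """Build a simple MLP with `depth` hidden layers of width `hidden`."""
    layers, d = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(d, hidden), act()]
        d = hidden
    layers.append(nn.Linear(d, out_dim))
    return nn.Sequential(*layers)

# Backbone h_theta for Permuted/Rotated MNIST: two hidden layers of
# 256 units with ReLU, per the setup description. The 784-dim input
# (28x28 flattened) and 256-dim output are illustrative assumptions.
h_theta = mlp(in_dim=784, hidden=256, depth=2, out_dim=256, act=nn.ReLU)

# Amortization networks f_gamma / f_phi: three hidden layers with ELU;
# 256 units per layer on the MNIST benchmarks (the paper uses 160 for
# Split CIFAR100 and 512 for miniImageNet instead).
f_gamma = mlp(in_dim=256, hidden=256, depth=3, out_dim=256, act=nn.ELU)

# Training protocol from the table: one epoch per task (each sample
# seen once), batch size 10.
batch = torch.randn(10, 784)
features = h_theta(batch)          # shape (10, 256)
amortized = f_gamma(features)      # shape (10, 256)
```

This only mirrors the reported widths, depths, and activations; the kernel-construction and episodic-memory components of the method are not reproduced here.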