Split-and-Bridge: Adaptable Class Incremental Learning within a Single Neural Network
Authors: Jong-Yeong Kim, Dong-Wan Choi (pp. 8137–8145)
AAAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In our thorough experimental analysis, our Split-and-Bridge method outperforms the state-of-the-art competitors in KD-based continual learning. |
| Researcher Affiliation | Academia | Jong-Yeong Kim, Dong-Wan Choi Department of Computer Science and Engineering, Inha University, South Korea kjy93217@naver.com, dchoi@inha.ac.kr |
| Pseudocode | Yes | Algorithm 1: Split-and-Bridge Incremental Learning |
| Open Source Code | Yes | https://github.com/bigdata-inha/Split-and-Bridge |
| Open Datasets | Yes | In our experiments, we train two benchmark datasets, CIFAR-100 (Krizhevsky, Hinton et al. 2009) and Tiny-ImageNet (Le and Yang 2015) |
| Dataset Splits | Yes | Tiny-ImageNet includes 100K training images and 10K validation images for 200 classes |
| Hardware Specification | Yes | We implement all the methods in PyTorch, and train each model on a machine with an NVIDIA TITAN RTX and Intel Core Xeon Gold 5122. |
| Software Dependencies | No | The paper mentions 'PyTorch' but does not specify a version number or other software dependencies with versions. |
| Experiment Setup | Yes | In the split phase, we divide two last residual blocks along with the final fully-connected (FC) layer of ResNet-18 into two disjoint partitions, i.e., θo and θn, implying S = 13 and L = 18. Full details are covered in our supplementary material. |
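The split phase quoted above keeps layers 1..S of ResNet-18 shared and divides the parameters of layers S+1..L (the last two residual blocks plus the FC layer) into two disjoint partitions, θo for old classes and θn for new classes. The sketch below illustrates only this indexing scheme; the function name, the layer-index bookkeeping, and the "old/new channels" labels are illustrative assumptions, not the authors' released code.

```python
# Illustrative sketch of the Split-and-Bridge split phase's layer indexing.
# ResNet-18 has L = 18 layers; the paper states the split point S = 13, so
# layers 14..18 (two residual blocks + FC) are partitioned into theta_o / theta_n.
# This is a hypothetical reconstruction, not the paper's implementation.

S, L = 13, 18  # split point and total depth, as given in the table row above

def split_layers(num_layers, split_point):
    """Return (shared, theta_o, theta_n) as lists describing the partition.

    Layers up to `split_point` stay shared; each deeper layer's parameters
    are split into two disjoint groups, one per class partition.
    """
    shared = list(range(1, split_point + 1))
    deep = range(split_point + 1, num_layers + 1)
    # Both partitions span the same deep layers but hold disjoint parameters
    # (modeled here by a per-layer tag rather than real channel slices).
    theta_o = [(layer, "old-channels") for layer in deep]
    theta_n = [(layer, "new-channels") for layer in deep]
    return shared, theta_o, theta_n

shared, theta_o, theta_n = split_layers(L, S)
```

With S = 13 and L = 18 this yields 13 shared layers and two disjoint partitions, each covering layers 14 through 18, matching the setup quoted from the paper.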