Evolving Parameterized Prompt Memory for Continual Learning

Authors: Muhammad Rifki Kurniawan, Xiang Song, Zhiheng Ma, Yuhang He, Yihong Gong, Yang Qi, Xing Wei

AAAI 2024

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Comprehensive experiments validate that our approach achieves state-of-the-art performance in both class and domain incremental learning scenarios. Source code is available at https://github.com/MIV-XJTU/EvoPrompt. |
| Researcher Affiliation | Academia | ¹School of Software Engineering, Xi'an Jiaotong University; ²College of Artificial Intelligence, Xi'an Jiaotong University; ³Shenzhen Institute of Advanced Technology, Chinese Academy of Sciences; ⁴School of Computer Science and Technology, Xi'an Jiaotong University |
| Pseudocode | No | The paper includes diagrams and textual descriptions of its method, but no formal pseudocode or algorithm blocks. |
| Open Source Code | Yes | Source code is available at https://github.com/MIV-XJTU/EvoPrompt. |
| Open Datasets | Yes | Our evaluation encompasses Split CIFAR-100 (Krizhevsky 2009) and Split ImageNet-R (Hendrycks et al. 2021) for CIL, and CORe50 (Lomonaco and Maltoni 2017) for DIL, maintaining the original sequential class order. |
| Dataset Splits | No | The paper describes training and testing datasets but does not specify a validation split or a cross-validation setup. |
| Hardware Specification | No | The paper does not specify the hardware (e.g., GPU/CPU models, memory) used to run the experiments. |
| Software Dependencies | No | The paper mentions the Adam optimizer but gives no version numbers for any software dependency (e.g., Python, PyTorch, CUDA). |
| Experiment Setup | Yes | Training uses an input size of 224 and a batch size of 64. Our approach uses the Adam optimizer (Kingma and Ba 2015) with a constant learning rate of 0.003 for 5 epochs on CORe50, and 0.05 for 20 epochs on Split CIFAR-100 and 50 epochs on Split ImageNet-R. |
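The hyperparameters quoted in the Experiment Setup row can be collected into a small configuration sketch. This is illustrative only: the dict layout and the `get_setup` helper are not taken from the released code, and the official repository remains authoritative.

```python
# Shared training settings reported in the paper.
TRAIN_CONFIG = {
    "input_size": 224,   # input resolution
    "batch_size": 64,
    "optimizer": "Adam", # Kingma and Ba 2015, constant learning rate
}

# Per-dataset learning rate and epoch count, as stated in the paper.
SCHEDULE = {
    "CORe50":           {"lr": 0.003, "epochs": 5},
    "Split CIFAR-100":  {"lr": 0.05,  "epochs": 20},
    "Split ImageNet-R": {"lr": 0.05,  "epochs": 50},
}

def get_setup(dataset: str) -> dict:
    """Merge the shared config with the dataset-specific schedule.

    Illustrative helper; not part of the paper's released code.
    """
    return {**TRAIN_CONFIG, **SCHEDULE[dataset]}
```

For example, `get_setup("CORe50")` yields the shared settings plus `lr=0.003` and `epochs=5`, matching the DIL schedule quoted above.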