Continual Learning in Linear Classification on Separable Data
Authors: Itay Evron, Edward Moroshko, Gon Buzaglo, Maroun Khriesh, Badea Marjieh, Nathan Srebro, Daniel Soudry
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We theoretically study the continual learning of a linear classification model on separable data with binary classes. Even though this is a fundamental setup to consider, there are still very few analytic results on it, since most of the continual learning theory thus far has focused on regression settings. |
| Researcher Affiliation | Academia | ¹Department of Electrical and Computer Engineering, Technion, Haifa, Israel; ²Toyota Technological Institute at Chicago, Chicago, IL, USA. |
| Pseudocode | Yes | Scheme 1 (Regularized Continual Learning). Initialization: $w_0^{(\lambda)} = \mathbf{0}_D$. Iterative update for each task $t \in [k]$: $w_t^{(\lambda)} = \operatorname{argmin}_{w \in \mathbb{R}^D} \sum_{(x,y) \in S_t} e^{-y w^{\top} x} + \lambda_t \big\| w - w_{t-1}^{(\lambda)} \big\|^2$ |
| Open Source Code | No | The paper does not provide an explicit statement or link for open-source code availability. |
| Open Datasets | No | The paper refers to 'separable datasets' and 'task sequences' but does not specify or provide access information for any publicly available or open datasets used in its analysis or illustrations. |
| Dataset Splits | No | The paper focuses on theoretical analysis and does not describe empirical experiments requiring training/test/validation dataset splits. |
| Hardware Specification | No | The paper is theoretical and does not mention specific hardware used for any computational work or simulations. |
| Software Dependencies | No | The paper does not mention specific software dependencies with version numbers. |
| Experiment Setup | No | The paper defines theoretical schemes and parameters (e.g., λ, p) for its analysis, but does not provide concrete experimental setup details like hyperparameters or training configurations for empirical validation. |
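The pseudocode row above transcribes Scheme 1, which starts from $w_0 = \mathbf{0}_D$ and, for each task, minimizes the exponential loss on that task plus a proximal penalty tying the solution to the previous iterate. A minimal sketch of that scheme is below; the paper only specifies the argmin, so the inner gradient-descent solver, the function names, and the step-size/iteration parameters here are illustrative assumptions, not details from the paper.

```python
import numpy as np

def train_task(X, y, w_prev, lam, lr=0.1, iters=500):
    """Approximately solve one Scheme 1 step:
        argmin_w  sum_i exp(-y_i * <w, x_i>)  +  lam * ||w - w_prev||^2
    Plain gradient descent stands in for the exact argmin (an assumption)."""
    w = w_prev.copy()
    for _ in range(iters):
        margins = y * (X @ w)                    # y_i * <w, x_i>
        loss_grad = -(y * np.exp(-margins)) @ X  # gradient of the exponential loss
        reg_grad = 2.0 * lam * (w - w_prev)      # gradient of the proximal penalty
        w -= lr * (loss_grad + reg_grad)
    return w

def scheme1(tasks, lam, dim):
    """Regularized continual learning over a task sequence.

    tasks: list of (X, y) pairs with labels y in {-1, +1};
    lam:   regularization strength lambda_t (held constant here for simplicity).
    """
    w = np.zeros(dim)                            # Initialization: w_0 = 0_D
    for X, y in tasks:
        w = train_task(X, y, w, lam)
    return w
```

On jointly separable toy tasks (e.g. one task separable along each coordinate axis), the proximal term keeps the direction learned on earlier tasks while the exponential loss fits the current one, so the final iterate classifies all tasks correctly, which is the behavior the paper analyzes.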