Fast Online Incremental Learning on Mixture Streaming Data
Authors: Yi Wang, Xin Fan, Zhongxuan Luo, Tianzhu Wang, Maomao Min, Jiebo Luo
AAAI 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Both theoretical analysis and numerical experiments have demonstrated much lower space and time costs (2-10 times faster) than the state of the art, with comparable classification accuracy. Both theoretical analysis and numerical experiments on several benchmark data sets validate the lowest computational costs of FLDA/QR and IFLDA/QR compared with the state-of-the-art batch LDA and incremental LDA algorithms, respectively. |
| Researcher Affiliation | Academia | Yi Wang, Xin Fan, Zhongxuan Luo (School of Software, Dalian University of Technology & Key Laboratory for Ubiquitous Network and Service Software of Liaoning Province, Dalian, China; {dlutwangyi, xin.fan, zxluo}@dlut.edu.cn); Tianzhu Wang (No. 254, Deta Leisure Town, Jinzhou New District, Dalian, China; wangtz@126.com); Maomao Min (School of Software, Dalian University of Technology, Dalian, China; neilfvhv@gmail.com); Jiebo Luo (Department of Computer Science, University of Rochester, Rochester, NY 14627, USA; jluo@cs.rochester.edu) |
| Pseudocode | Yes | Algorithm 1 FLDA/QR (Cholesky). Input: the data matrix X ∈ ℝ^(d×n). Output: the optimal transformation matrix G_c. Algorithm 2 IFLDA/QR1 (insertion of new samples into an existing class). Input: the Q and R matrices of the center matrix C from the last step, and labeled new samples X_new^(i) ∈ ℝ^(d×h_i), i ∈ {1, ..., k}. Output: the updated Q_c, R_c and G_c. Algorithm 3 IFLDA/QR2 (insertion of a novel class). Input: the Q and R matrices of the center matrix C from the previous step, and a novel cluster X_new ∈ ℝ^(d×n_new). Output: the updated Q_c, R_c and G_c. Algorithm 4 IFLDA/QR3 (insertion of chunk data). Input: the Q and R matrices of the old center matrix, labeled new samples X_new^(i) ∈ ℝ^(d×h_i), i ∈ {1, ..., k}, and novel classes X_new_j ∈ ℝ^(d×h_j), j ∈ {1, ..., f_2}. Output: the updated Q_c, R_c and G_c. A hedged NumPy sketch of the batch FLDA/QR step is given after the table. |
| Open Source Code | Yes | The MATLAB code of the proposed FLDA/QR and IFLDA/QR algorithms can be downloaded from https://github.com/dlut-dimt/FLDA-QR-and-IFLDA-QR. |
| Open Datasets | Yes | The experiments are conducted on a face image dataset, ORL (Samaria and Harter 1994); an animal feature dataset, AWA (Lampert, Nickisch, and Harmeling 2013); and a text document feature set, Tr23, derived from the TREC collection (http://trec.nist.gov/), whose classes are of unequal size. |
| Dataset Splits | Yes | The K-nearest neighbor (KNN) method (with K = 1) was used for classification, and fivefold cross-validation was conducted (see the evaluation sketch after the table). |
| Hardware Specification | Yes | The experiments are performed on a 2.5 GHz Intel Core i7 Apple MacBook Pro with 16 GB RAM in a MATLAB 2014b environment. |
| Software Dependencies | Yes | The experiments are performed on a 2.5 GHz Intel Core i7 Apple MacBook Pro with 16 GB RAM in a MATLAB 2014b environment. |
| Experiment Setup | No | The paper specifies K-nearest neighbor (KNN) classification with K = 1 and fivefold cross-validation, but does not report further setup details for the LDA-based methods themselves (e.g., the dimensionality of the projected space or the chunk sizes used for the incremental updates); as closed-form decomposition methods, FLDA/QR and IFLDA/QR have no learning rates, batch sizes, epochs, or optimizer settings. |
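
For concreteness, below is a minimal NumPy sketch of an LDA/QR-style batch transform in the spirit of Algorithm 1 (FLDA/QR). The authors' released implementation is in MATLAB; the function name `lda_qr`, the economy-QR-plus-small-eigenproblem formulation, and the small regularization constant are illustrative assumptions rather than the paper's exact Cholesky-based procedure.

```python
# Hedged sketch of an LDA/QR-style batch transform (in the spirit of
# Algorithm 1, FLDA/QR). Names and the small regularizer are assumptions.
import numpy as np
from scipy.linalg import qr, eigh

def lda_qr(X, y):
    """X: d x n data matrix (columns are samples), y: length-n label array.
    Returns a d x (k-1) transformation matrix (k = number of classes)."""
    y = np.asarray(y)
    d, n = X.shape
    classes = np.unique(y)
    k = len(classes)

    # Class-centroid matrix C (d x k): one column per class mean.
    C = np.column_stack([X[:, y == c].mean(axis=1) for c in classes])

    # Economy QR of the centroid matrix; Q spans the centroid subspace.
    Q, R = qr(C, mode='economic')               # Q: d x k, R: k x k

    # Reduced between-class and within-class scatter matrices (k x k each).
    mu = X.mean(axis=1, keepdims=True)
    Sb = np.zeros((k, k))
    Sw = np.zeros((k, k))
    for j, c in enumerate(classes):
        Xc = X[:, y == c]
        nc = Xc.shape[1]
        diff = Q.T @ (C[:, [j]] - mu)           # projected centroid offset
        Sb += nc * (diff @ diff.T)
        Zc = Q.T @ (Xc - C[:, [j]])             # projected within-class deviations
        Sw += Zc @ Zc.T

    # Small k x k generalized eigenproblem: Sb w = lambda Sw w.
    # A tiny ridge keeps Sw positive definite if it happens to be singular.
    evals, W = eigh(Sb, Sw + 1e-8 * np.eye(k))
    order = np.argsort(evals)[::-1][: k - 1]    # keep the top k-1 directions
    return Q @ W[:, order]                      # d x (k-1) final transform
```

The attraction of working through the QR factorization of the centroid matrix is that every scatter computation after projecting by Q involves only k x k matrices, which is also what makes updating Q and R incrementally (Algorithms 2-4) cheap compared with recomputing a full-dimensional decomposition.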
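
Similarly, the evaluation protocol quoted in the Dataset Splits row (1-NN classification with fivefold cross-validation) could be reproduced along the following lines. The use of scikit-learn's `StratifiedKFold` and `KNeighborsClassifier`, and the reuse of the `lda_qr` sketch above, are assumptions; the original experiments were run in MATLAB 2014b.

```python
# Hedged sketch of the quoted protocol: 1-nearest-neighbour classification in
# the reduced space with fivefold cross-validation. Stratified folds and the
# lda_qr sketch above are assumptions, not the authors' exact setup.
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.neighbors import KNeighborsClassifier

def evaluate(X, y, n_splits=5, seed=0):
    """X: d x n matrix (columns are samples), y: length-n labels.
    Returns the mean fivefold cross-validation accuracy of 1-NN applied
    after the LDA/QR projection learned on each training fold."""
    y = np.asarray(y)
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=seed)
    accs = []
    for tr, te in skf.split(X.T, y):
        G = lda_qr(X[:, tr], y[tr])             # fit the projection on the training fold only
        Ztr = (G.T @ X[:, tr]).T                # training samples in the reduced space
        Zte = (G.T @ X[:, te]).T                # test samples in the reduced space
        knn = KNeighborsClassifier(n_neighbors=1).fit(Ztr, y[tr])
        accs.append(knn.score(Zte, y[te]))
    return float(np.mean(accs))
```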