Robust Kernel Dictionary Learning Using a Whole Sequence Convergent Algorithm
Authors: Huaping Liu, Jie Qin, Hong Cheng, Fuchun Sun
IJCAI 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section we present several experimental results demonstrating the effectiveness of the proposed robust kernel dictionary learning. |
| Researcher Affiliation | Academia | Huaping Liu (1,2), Jie Qin (1,2), Hong Cheng (3), Fuchun Sun (1,2). (1) Department of Computer Science and Technology, Tsinghua University, Beijing, China; (2) State Key Lab. of Intelligent Technology and Systems, TNLIST, Beijing, China; (3) Center for Robotics, University of Electronic Science and Technology of China, Chengdu, China. Contact: hpliu@tsinghua.edu.cn |
| Pseudocode | No | The paper describes the algorithm steps (Calculating C, A, R) but does not present them in a formal pseudocode block or an explicitly labeled "Algorithm" section. |
| Open Source Code | No | The paper does not provide any explicit statement or link indicating that the source code for the described methodology is publicly available. |
| Open Datasets | Yes | USPS digit recognition [Nguyen et al., 2013]: this dataset contains 10 classes of 256-dimensional handwritten digits. Virus image classification [Harandi and Salzmann, 2015]: this dataset includes 15 different classes. Kylberg texture classification [Kylberg, 2011]: this dataset includes 28 classes. UCMERCED scene classification [Yang and Newsam, 2010]: this dataset includes 21 challenging scene categories with 100 samples per class. |
| Dataset Splits | No | The paper specifies training and testing splits (e.g., "N = 500 samples for training and 200 samples for testing") but does not explicitly mention a separate validation split or how parameters were tuned on a validation set. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware used to run the experiments, such as CPU or GPU models. |
| Software Dependencies | No | The paper does not provide specific software dependencies, libraries, or solvers with version numbers that would be needed to replicate the experiment. |
| Experiment Setup | Yes | For the proposed method, we solve the optimization problem in Eq.(2) with λ1 = 0.01 and λ2 = 0.001. ... As to the proposed method, we fix the regularization parameters λ1 = 0.001 and λ2 = 0.0001 and the maximum iteration number is 100. |
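The "Experiment Setup" row quotes regularization settings (e.g., λ1 = 0.01, λ2 = 0.001) and a 100-iteration cap, but the paper releases neither code nor formal pseudocode for its C/A/R updates. The sketch below is therefore only a minimal illustration of how a kernelized dictionary-learning run with those hyperparameters might be wired up, assuming a generic objective of the form 0.5·||Φ(Y) − Φ(Y)CA||²_F + λ1·||A||₁ plus a λ2-weighted regularizer; the RBF kernel, the proximal-gradient updates, and all function names are assumptions for illustration, not the paper's Eq. (2) or its whole-sequence convergent algorithm.

```python
import numpy as np

# Hyperparameters taken from the quoted "Experiment Setup" row:
# lambda1 = 0.01, lambda2 = 0.001, max_iter = 100.
# The kernel choice and the update rules below are illustrative assumptions,
# NOT the paper's Eq. (2) or its C / A / R calculation steps.

def rbf_kernel(Y, gamma=1.0):
    """Gram matrix K with K[i, j] = exp(-gamma * ||y_i - y_j||^2)."""
    sq = np.sum(Y ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Y @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def soft_threshold(X, t):
    """Element-wise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def kernel_dictionary_learning(Y, n_atoms=50, lambda1=0.01, lambda2=0.001,
                               max_iter=100, step=1e-2, gamma=1.0, seed=0):
    """Toy alternating scheme: dictionary atoms are Phi(Y) @ C, codes are A."""
    rng = np.random.default_rng(seed)
    N = Y.shape[0]
    K = rbf_kernel(Y, gamma)                       # K = Phi(Y)^T Phi(Y)
    C = rng.standard_normal((N, n_atoms)) * 0.01   # atom coefficients
    A = np.zeros((n_atoms, N))                     # sparse codes
    for _ in range(max_iter):
        # Sparse-coding step: proximal gradient on
        # 0.5 * ||Phi(Y) - Phi(Y) C A||_F^2 + lambda1 * ||A||_1 (kernel form).
        grad_A = C.T @ K @ (C @ A) - C.T @ K
        A = soft_threshold(A - step * grad_A, step * lambda1)
        # Dictionary-update step with a lambda2-weighted ridge term standing in
        # for whatever robustness term the paper actually uses.
        grad_C = K @ C @ (A @ A.T) - K @ A.T + lambda2 * C
        C -= step * grad_C
    return C, A

if __name__ == "__main__":
    Y = np.random.default_rng(1).standard_normal((200, 16))  # toy data
    C, A = kernel_dictionary_learning(Y)
    print("code sparsity:", np.mean(A != 0))
```

Because the paper does not specify the kernel, the step sizes, or the robust error term, a faithful reproduction would need those details from the authors; this sketch only makes the reported λ1, λ2, and iteration budget concrete.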