Robust Principal Component Analysis with Side Information
Authors: Kai-Yang Chiang, Cho-Jui Hsieh, Inderjit Dhillon
ICML 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In addition, we conduct synthetic experiments as well as a real application on noisy image classification to show that our method also improves the performance in practice by exploiting side information. |
| Researcher Affiliation | Academia | Kai-Yang Chiang (KYCHIANG@CS.UTEXAS.EDU), Cho-Jui Hsieh (CHOHSIEH@UCDAVIS.EDU), Inderjit S. Dhillon (INDERJIT@CS.UTEXAS.EDU); Department of Computer Science, The University of Texas at Austin, Austin, TX 78712, USA; Department of Statistics and Computer Science, University of California at Davis, Davis, CA 95616, USA |
| Pseudocode | Yes | Algorithm 1 ALM method for PCPF (see the solver sketch after the table) |
| Open Source Code | No | The paper mentions using a third-party ALM solver for PCP but does not provide any link or statement about the availability of their own code for PCPF. |
| Open Datasets | Yes | We consider the digit recognition dataset MNIST, which includes 50,000 training images and 10,000 testing images, and each image is a handwriting digit described as a 784 dimensional vector. |
| Dataset Splits | No | The paper mentions training and testing sets for MNIST but does not specify a validation set or explicit split percentages for all three. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for experiments. |
| Software Dependencies | No | The paper mentions using LIBLINEAR and LIBSVM with citations but does not provide specific version numbers for these or any other software dependencies. |
| Experiment Setup | Yes | The parameters λ in both PCP and PCPF are set as 1/√n by default as Theorem 1 suggested. The convergence criterion is set to be ‖R − S − XHYᵀ‖_F / ‖R‖_F < 10⁻⁷ as suggested in Candès et al. (2011). |
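
The Pseudocode row above cites Algorithm 1, an ALM method for PCPF, i.e. minimizing ‖H‖_* + λ‖S‖_1 subject to R = XHYᵀ + S given feature matrices X and Y. As a rough illustration of how such a solver is structured, here is a minimal NumPy sketch; it is not the paper's Algorithm 1 itself. It assumes X and Y have orthonormal columns, and the helper names and the μ/ρ update schedule are standard inexact-ALM conventions rather than values taken from the paper.

```python
import numpy as np

def soft_threshold(A, tau):
    """Elementwise soft-thresholding: prox operator of the l1 norm."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def svd_threshold(A, tau):
    """Singular value thresholding: prox operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def pcpf_alm(R, X, Y, lam=None, mu=None, rho=1.5, tol=1e-7, max_iter=500):
    """Sketch of an ALM solver for
        min_{H,S}  ||H||_* + lam * ||S||_1   s.t.   R = X H Y^T + S,
    assuming X^T X = I and Y^T Y = I (orthonormal feature bases)."""
    n1, n2 = R.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(n1, n2))   # 1/sqrt(n) scaling, cf. the Experiment Setup row
    if mu is None:
        mu = 1.25 / np.linalg.norm(R, 2)   # common heuristic initialization of the penalty parameter
    S = np.zeros_like(R)
    Z = np.zeros_like(R)                   # Lagrange multiplier for the equality constraint
    H = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(max_iter):
        # H-update: with orthonormal X, Y this reduces to SVT in the feature space
        H = svd_threshold(X.T @ (R - S + Z / mu) @ Y, 1.0 / mu)
        XHYt = X @ H @ Y.T
        # S-update: elementwise shrinkage of the sparse residual
        S = soft_threshold(R - XHYt + Z / mu, lam / mu)
        # Dual ascent on Z and relative-residual stopping test
        residual = R - XHYt - S
        Z = Z + mu * residual
        mu = rho * mu
        if np.linalg.norm(residual, 'fro') <= tol * np.linalg.norm(R, 'fro'):
            break
    return H, S
```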
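
The defaults in the sketch mirror the Experiment Setup row: λ = 1/√n (here interpreted, as an assumption, with n taken as the larger matrix dimension) and a relative-residual stopping tolerance of 10⁻⁷. A hypothetical call on synthetic data might look like this; the sizes, random data, and QR orthonormalization are illustrative only and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, d1, d2 = 200, 150, 20, 15
X, _ = np.linalg.qr(rng.standard_normal((n1, d1)))   # orthonormalize a synthetic row-feature basis
Y, _ = np.linalg.qr(rng.standard_normal((n2, d2)))   # orthonormalize a synthetic column-feature basis
R = X @ rng.standard_normal((d1, d2)) @ Y.T          # low-rank-in-feature-space observation
H, S = pcpf_alm(R, X, Y, lam=1.0 / np.sqrt(max(n1, n2)), tol=1e-7)
```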