Towards Practical Alternating Least-Squares for CCA
Authors: Zhiqiang Xu, Ping Li
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on several datasets empirically demonstrate the superiority of the proposed algorithms to several recent variants of CCA solvers. |
| Researcher Affiliation | Industry | Zhiqiang Xu and Ping Li Cognitive Computing Lab Baidu Research No.10 Xibeiwang East Road, Beijing, 10085, China 10900 NE 8th St, Bellevue, WA 98004, USA {xuzhiqiang04,liping11}@baidu.com |
| Pseudocode | Yes | Algorithm 1 TALS-CCA, Algorithm 2 FALS-CCA, Algorithm 3 AALS-CCA |
| Open Source Code | No | The paper states 'All the algorithms were implemented in MATLAB,' but does not provide any explicit statement about making the source code open or available, nor does it provide a link to a code repository. |
| Open Datasets | Yes | Three real-world datasets are used: Mediamill [18], JW11 [17], and MNIST [14]. |
| Dataset Splits | No | The paper mentions using three real-world datasets (Mediamill, JW11, MNIST) and describes solver iterations, but it does not specify any explicit train/validation/test dataset splits (e.g., percentages, sample counts, or references to predefined splits). |
| Hardware Specification | Yes | All the algorithms were implemented in MATLAB, and run on a laptop with 8 GB memory. |
| Software Dependencies | No | The paper states 'All the algorithms were implemented in MATLAB,' but does not provide specific version numbers for MATLAB or any other software dependencies. |
| Experiment Setup | Yes | Regularization parameters are fixed to r_x = r_y = 0.1. Stochastic variance reduced gradient (SVRG) is the least-squares solver used for each algorithm. Throughout the experiments the solver runs 2 epochs, each with n iterations, using constant step-sizes α_Φ = 1 / maxᵢ ‖xᵢ‖₂² for Φ_t and α_Ψ = 1 / maxᵢ ‖yᵢ‖₂² for Ψ_t, where xᵢ is the i-th column of X. |
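The reported setup (SVRG as the inner least-squares solver, 2 epochs of n iterations, constant step-size α = 1/maxᵢ ‖xᵢ‖₂², regularization 0.1) can be sketched as follows. This is a minimal illustration on hypothetical toy data, not the authors' MATLAB implementation; the target vector `c` and the ridge least-squares objective stand in for one inner subproblem of the alternating scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data standing in for one CCA view: X holds n columns x_i.
d, n = 5, 100
X = rng.standard_normal((d, n))
c = rng.standard_normal(n)   # toy right-hand side for the inner subproblem
r_x = 0.1                    # regularization, as in the reported setup

# Constant SVRG step-size from the setup: alpha = 1 / max_i ||x_i||_2^2.
alpha = 1.0 / np.max(np.sum(X**2, axis=0))

# Ridge least-squares objective: f(w) = (1/2n)||X^T w - c||^2 + (r_x/2)||w||^2.
def full_grad(w):
    return X @ (X.T @ w - c) / n + r_x * w

def stoch_grad(w, i):
    # Unbiased per-sample gradient estimate.
    return X[:, i] * (X[:, i] @ w - c[i]) + r_x * w

w = np.zeros(d)
for epoch in range(2):           # solver runs 2 epochs per the setup
    w_ref = w.copy()
    g_ref = full_grad(w_ref)     # full gradient at the reference point
    for _ in range(n):           # each epoch runs n inner iterations
        i = rng.integers(n)
        # Variance-reduced update.
        w -= alpha * (stoch_grad(w, i) - stoch_grad(w_ref, i) + g_ref)
```

Two epochs of n variance-reduced steps with this 1/L-style step-size are typically enough to make solid progress on a well-conditioned ridge subproblem, which is why the paper can afford such an inexact inner solve per alternating round.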