Total Least Squares Regression in Input Sparsity Time
Authors: Huaian Diao, Zhao Song, David P. Woodruff, Xin Yang
NeurIPS 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We empirically validate our algorithm on real and synthetic data sets. |
| Researcher Affiliation | Academia | Huaian Diao (Northeast Normal University & KLAS of MOE, hadiao@nenu.edu.cn); Zhao Song (University of Washington, zhaosong@uw.edu); David P. Woodruff (Carnegie Mellon University, dwoodruf@cs.cmu.edu); Xin Yang (University of Washington, yx1992@cs.washington.edu) |
| Pseudocode | Yes | Algorithm 1 Our Fast Total Least Squares Algorithm |
| Open Source Code | Yes | The code can be found at https://github.com/yangxinuw/total_least_squares_code. |
| Open Datasets | Yes | We also conducted experiments on real datasets from the UCI Machine Learning Repository [DKT17]. ... We have four real datasets: Airfoil Self-Noise [UCIa] in Table 2(a), Wine Quality (red wine) [UCIc, CCA+09] in Table 2(b), Wine Quality (white wine) [UCIc, CCA+09] in Table 2(c), and Insurance Company Benchmark (COIL 2000) Data Set [UCIb, PS] |
| Dataset Splits | No | The paper discusses synthetic data and real datasets from UCI, and specifies 'sample density' for sketching, but does not provide explicit training/test/validation split percentages or sample counts for the datasets used in experiments. |
| Hardware Specification | Yes | Our numerical tests are carried out on an Intel Xeon E7-8850 v2 server with 2.30GHz and 4GB RAM |
| Software Dependencies | Yes | under Matlab R2017b. |
| Experiment Setup | Yes | In the experiments, we take sample density ρ = 0.1, 0.3, 0.6, 0.9 respectively to check our performance. |
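For context on the problem the paper addresses: in total least squares (TLS), unlike ordinary least squares, perturbations are allowed in both the design matrix A and the response b. The classical exact solution goes through an SVD of the augmented matrix [A | b], which is the slow baseline the paper's sketching-based algorithm accelerates. Below is a minimal sketch of that classical SVD baseline in Python (the paper's own experiments were run in Matlab, and this is not the authors' fast algorithm); the function name and example data are illustrative only.

```python
import numpy as np

def total_least_squares(A, b):
    """Classical SVD-based total least squares (Golub-Van Loan style).

    Finds x minimizing ||[dA | db]||_F subject to (A + dA) x = b + db,
    via the right singular vector of [A | b] for its smallest singular value.
    """
    n = A.shape[1]
    Z = np.column_stack([A, b])          # augmented matrix [A | b]
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    v = Vt[-1]                           # right singular vector, smallest sigma
    # v satisfies [A | b] v ~= 0; normalize so the last entry is -1:
    # A * (-v[:n] / v[n]) ~= b, giving the TLS solution.
    return -v[:n] / v[n]

# Illustrative usage on a consistent system (TLS recovers the exact x).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([2.0, 3.0])
b = A @ x_true
x_tls = total_least_squares(A, b)
```

On exactly consistent data the smallest singular value of [A | b] is zero, so the recovered x coincides with the OLS solution; the two differ once noise is present in A as well as b, which is the regime the paper's experiments target.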