Error Analysis of Generalized Nyström Kernel Regression
Authors: Hong Chen, Haifeng Xia, Heng Huang, Weidong Cai
NeurIPS 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental analysis demonstrates the satisfactory performance of GNKR with the column norm sampling. |
| Researcher Affiliation | Academia | Hong Chen, Computer Science and Engineering, University of Texas at Arlington, Arlington, TX 76019, chenh@mail.hzau.edu.cn; Haifeng Xia, Mathematics and Statistics, Huazhong Agricultural University, Wuhan 430070, China, haifeng.xia0910@gmail.com; Weidong Cai, School of Information Technologies, University of Sydney, NSW 2006, Australia, tom.cai@sydney.edu.au; Heng Huang, Computer Science and Engineering, University of Texas at Arlington, Arlington, TX 76019, heng@uta.edu |
| Pseudocode | No | The paper describes mathematical formulations and algorithms in prose and equations (e.g., equations (1) to (5)), but it does not include a distinct pseudocode block or algorithm box. |
| Open Source Code | No | The paper does not contain any explicit statement about releasing source code or providing a link to a code repository. |
| Open Datasets | Yes | Wine Quality, CASP, and Year Prediction datasets (http://archive.ics.uci.edu/ml/) and the census-house dataset (http://www.cs.toronto.edu/~delve/data/census-house/desc.html). |
| Dataset Splits | No | For the training samples, the output y is contaminated by Gaussian noise N(0, 1). For each function and each kernel, we run the experiment 20 times. The paper mentions splitting data into training and testing parts but does not explicitly describe a separate validation set or its split. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for running the experiments (e.g., CPU, GPU models, memory, or cloud instances). |
| Software Dependencies | No | The paper mentions the types of kernels used (Gaussian kernel, Epanechnikov kernel) and notes that the output y is contaminated by Gaussian noise N(0, 1), but it does not specify any software dependencies or their version numbers (e.g., Python, TensorFlow, PyTorch, scikit-learn versions). |
| Experiment Setup | Yes | The Gaussian kernel K_G(x, t) = exp(−‖x − t‖₂² / (2σ²)) is used for both the simulated and the real data, and the Epanechnikov kernel K_E(x, t) = (1 − ‖x − t‖₂² / (2σ²))₊ is used in the simulated experiment. Here σ denotes the scale parameter, selected from [10⁻⁵ : 10 : 10⁴]. Following the discussion on parameter selection in [16], the regularization parameter of GNKR is selected from [10⁻¹⁵ : 10 : 10⁻³]. The best results are reported according to the Root Mean Squared Error (RMSE). |
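
The kernel definitions and selection criterion quoted in the Experiment Setup row translate directly into code. Below is a minimal NumPy sketch of the two kernels, the RMSE criterion, and the quoted parameter grids; the function names and array shapes are assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, T, sigma):
    """K_G(x, t) = exp(-||x - t||_2^2 / (2 * sigma^2))."""
    sq_dists = np.sum((X[:, None, :] - T[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def epanechnikov_kernel(X, T, sigma):
    """K_E(x, t) = (1 - ||x - t||_2^2 / (2 * sigma^2))_+, truncated at zero."""
    sq_dists = np.sum((X[:, None, :] - T[None, :, :]) ** 2, axis=-1)
    return np.maximum(1.0 - sq_dists / (2.0 * sigma ** 2), 0.0)

def rmse(y_true, y_pred):
    """Root Mean Squared Error, the reporting criterion named in the paper."""
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

# Log-spaced grids quoted in the Experiment Setup row.
sigma_grid = 10.0 ** np.arange(-5, 5)       # 1e-5, 1e-4, ..., 1e4
lambda_grid = 10.0 ** np.arange(-15, -2)    # 1e-15, 1e-14, ..., 1e-3
```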
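
For context on the method named in the title, the sketch below shows a generic Nyström-type kernel regression with column-norm sampling, i.e., landmark columns drawn with probability proportional to their squared Euclidean norms. This is a hedged stand-in rather than the paper's exact GNKR estimator: the ℓ2 coefficient penalty, the scaling by n, and all function names are illustrative assumptions made here.

```python
import numpy as np

def column_norm_sample(K, m, rng):
    """Draw m landmark indices with probability proportional to squared column norms of K."""
    probs = np.sum(K ** 2, axis=0)
    probs = probs / probs.sum()
    return rng.choice(K.shape[1], size=m, replace=False, p=probs)

def nystrom_kernel_regression(X, y, kernel, m, lam, rng=None):
    """Fit f(x) = sum_j alpha_j * K(x, x_j) over an m-point column-norm subsample.

    Solves min_a ||K_nm a - y||^2 + lam * n * ||a||^2 in the subsampled
    coefficient space (an illustrative choice of regularization).
    """
    rng = np.random.default_rng() if rng is None else rng
    n = X.shape[0]
    K_full = kernel(X, X)                       # n x n kernel matrix
    idx = column_norm_sample(K_full, m, rng)    # column-norm sampling of landmarks
    K_nm = K_full[:, idx]                       # n x m reduced kernel matrix
    alpha = np.linalg.solve(K_nm.T @ K_nm + lam * n * np.eye(m), K_nm.T @ y)
    landmarks = X[idx]
    return lambda X_new: kernel(X_new, landmarks) @ alpha
```

Pairing this with `gaussian_kernel` above, e.g. `predict = nystrom_kernel_regression(X, y, lambda A, B: gaussian_kernel(A, B, sigma=1.0), m=50, lam=1e-6)`, and scoring `rmse(y_test, predict(X_test))` over the σ and λ grids mirrors the selection procedure quoted in the Experiment Setup row.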