Deep Spectral Kernel Learning
Authors: Hui Xue, Zheng-Fan Wu, Wei-Xiang Sun
IJCAI 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Systematic experiments demonstrate the superiority of DSKN compared with state-of-the-art relevant algorithms on a variety of standard real-world tasks. In the experiments section, the authors evaluate DSKN against several state-of-the-art algorithms on a variety of typical tasks and report all-round performance improvements. |
| Researcher Affiliation | Academia | Hui Xue, Zheng-Fan Wu and Wei-Xiang Sun (School of Computer Science and Engineering, Southeast University, Nanjing, 210096, China; MOE Key Laboratory of Computer Network and Information Integration (Southeast University), China). {hxue, zfwu, vex-soon}@seu.edu.cn |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any concrete access to source code for the methodology described. |
| Open Datasets | Yes | We first conduct classification experiments on four benchmark datasets: four, ionosphere, splice and wdbc [Blake and Merz, 1998]. Second, we conduct regression experiments on four other datasets: airfoil, boston, concrete and energy [Blake and Merz, 1998]. Furthermore, to evaluate the performance of DSKN on training data of different scales, we conduct an image classification experiment on the MNIST dataset [LeCun et al., 1998]. |
| Dataset Splits | Yes | All data are scaled by z-score standardization and randomly divided into two non-overlapping, equally sized training and test sets. For MNIST, 10,000 images are randomly selected as test data and the rest are training data; the training data are further subsampled to different scales, from 5% to 100% (a hedged preprocessing sketch follows the table). |
| Hardware Specification | No | The paper does not provide specific hardware details used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers. |
| Experiment Setup | Yes | As prior knowledge, the most widely used classic Gaussian kernels serve as the internal basic kernel elements of DSKN, whose spectral surfaces S are Gaussian distributions. The scales of all deep architectures in the experiments are uniformly set to 1000-500-50. Sigmoid is used as the activation function in the neural networks. DSKN and the compared kernels are applied to the same Gaussian process models for classification and regression and optimized by Adam (a hedged configuration sketch follows the table). |
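
The split protocol quoted under Dataset Splits can be summarized in a short sketch. This is a hypothetical reconstruction, not the authors' code: the function names (`split_and_standardize`, `subsample`), the use of NumPy/scikit-learn, and the choice to fit the z-score scaler on the training half are assumptions, since the paper does not state whether standardization is applied before or after the split.

```python
# Hypothetical sketch of the split protocol described above (not the authors' code).
# Assumes a generic NumPy dataset (X, y); scikit-learn is an assumed dependency.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

def split_and_standardize(X, y, seed=0):
    # Random, non-overlapping, equally sized training and test sets.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.5, random_state=seed)
    # z-score standardization (here fit on the training half; an assumption,
    # since the paper only says the data are z-score standardized).
    scaler = StandardScaler().fit(X_tr)
    return scaler.transform(X_tr), scaler.transform(X_te), y_tr, y_te

def subsample(X_tr, y_tr, fraction, seed=0):
    # Sample the training set to a given scale, e.g. 0.05 ... 1.0 for MNIST.
    rng = np.random.default_rng(seed)
    n = max(1, int(round(fraction * len(X_tr))))
    idx = rng.choice(len(X_tr), size=n, replace=False)
    return X_tr[idx], y_tr[idx]
```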
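
For the Experiment Setup row, the following PyTorch sketch illustrates what a 1000-500-50 stack of Gaussian-spectral feature maps trained with Adam could look like. It is a minimal illustration under stated assumptions, not the DSKN implementation: the `SpectralLayer` class, the random-Fourier-feature parameterization (trainable Gaussian-initialized frequencies plus uniform phases), and the learning rate are all hypothetical, and the Gaussian process wrapper that the paper plugs the kernel into is omitted.

```python
# Hedged sketch of the reported architecture scale (1000-500-50), not the authors' DSKN code.
# Each layer applies a random-Fourier-feature style spectral mapping whose frequencies are
# initialized from a Gaussian (matching the Gaussian spectral prior mentioned above) and
# left trainable so Adam can optimize them end to end.
import math
import torch
import torch.nn as nn

class SpectralLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        # Gaussian-initialized spectral frequencies and uniform phases (assumed parameterization).
        self.freq = nn.Parameter(torch.randn(in_dim, out_dim))
        self.phase = nn.Parameter(2 * math.pi * torch.rand(out_dim))

    def forward(self, x):
        # cos(xW + b) is the classic random-Fourier-feature map for a stationary kernel.
        return math.sqrt(2.0 / self.freq.shape[1]) * torch.cos(x @ self.freq + self.phase)

def deep_spectral_features(in_dim, widths=(1000, 500, 50)):
    # Stack spectral layers with the widths reported in the experiments.
    layers, d = [], in_dim
    for w in widths:
        layers.append(SpectralLayer(d, w))
        d = w
    return nn.Sequential(*layers)

# The induced kernel between two inputs is the inner product of their final 50-dim features.
phi = deep_spectral_features(in_dim=34)            # e.g. ionosphere has 34 attributes
optimizer = torch.optim.Adam(phi.parameters(), lr=1e-3)  # learning rate is an assumption
```

In the paper's setup, the resulting deep kernel and the compared kernels are plugged into the same Gaussian process classification and regression models, with all parameters optimized by Adam.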