Automated Spectral Kernel Learning

Authors: Jian Li, Yong Liu, Weiping Wang (pp. 4618-4625)

AAAI 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In this section, compared with other algorithms, we evaluate the empirical behavior of our proposed algorithm ASKL on several benchmark datasets to demonstrate the effects of factors used in our algorithm, including the non-stationary spectral kernel, updating spectral density with backpropagation and additional regularization terms. Extensive experimental results validate the effectiveness of the proposed algorithm and coincide with our theoretical findings."
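The non-stationary spectral kernel and backpropagated spectral density mentioned in the row above are commonly realized as a random-feature map with trainable frequency matrices. The sketch below illustrates one such construction with fixed numpy frequencies; the function name, the paired two-frequency form, and all dimensions are illustrative assumptions, not details taken from the paper (which trains the frequencies with PyTorch).

```python
import numpy as np

def nonstationary_spectral_features(X, Omega1, Omega2):
    """Sketch of a non-stationary spectral-kernel feature map.

    Two frequency matrices (D spectral points each) yield paired
    cosine/sine features; in a trainable setting these frequencies
    would be network parameters updated by backpropagation.
    """
    D = Omega1.shape[1]
    Z1, Z2 = X @ Omega1, X @ Omega2          # (n, D) projections
    return np.hstack([np.cos(Z1) + np.cos(Z2),
                      np.sin(Z1) + np.sin(Z2)]) / np.sqrt(2 * D)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))                  # 5 points in R^3
Omega1 = rng.normal(size=(3, 100))           # D = 100 spectral frequencies
Omega2 = rng.normal(size=(3, 100))
Phi = nonstationary_spectral_features(X, Omega1, Omega2)
K = Phi @ Phi.T                              # approximate kernel matrix
```

Because the kernel matrix is a Gram matrix of explicit features, it is symmetric and positive semidefinite by construction.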
Researcher Affiliation | Academia | "1 Institute of Information Engineering, Chinese Academy of Sciences; 2 School of Cyber Security, University of Chinese Academy of Sciences; {lijian9026, liuyong, wangweiping}@iie.ac.cn"
Pseudocode | No | The paper describes the algorithm steps mathematically and in text but does not include a formal pseudocode block or algorithm listing.
Open Source Code | No | The paper mentions implementing algorithms based on PyTorch but does not provide any concrete access information (e.g., a URL or an explicit statement of code release) for its methodology.
Open Datasets | Yes | "We evaluate the performance of the proposed learning framework ASKL and compared algorithms based on several publicly available datasets, including both classification and regression tasks. MNIST dataset (LeCun et al. 1998)."
Dataset Splits | Yes | "... by using 5-fold cross-validation and grid search over parameter candidate sets. To obtain stable results, we run methods on each dataset 30 times with random partitions such that 80% of the data is used for training and 20% for testing."
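The evaluation protocol quoted above (30 runs over random 80%/20% partitions) can be sketched as a split generator; the function name, seed, and use of numpy are assumptions for illustration, not the paper's code.

```python
import numpy as np

def random_splits(n, n_runs=30, train_frac=0.8, seed=0):
    """Yield (train_idx, test_idx) pairs for repeated random
    80%/20% partitions, mirroring the reported 30-run protocol."""
    rng = np.random.default_rng(seed)
    n_train = int(train_frac * n)
    for _ in range(n_runs):
        perm = rng.permutation(n)      # fresh random partition per run
        yield perm[:n_train], perm[n_train:]

# Example: 30 splits of a 100-example dataset
splits = list(random_splits(100))
```

Reporting the mean and standard deviation over such splits is what makes the results "stable" in the quoted sense.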
Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., CPU/GPU models, memory) used for running the experiments.
Software Dependencies | No | "We implement all algorithms based on PyTorch and use Adam as optimizer with 32 examples in a mini-batch to solve the minimization problem."
Experiment Setup | Yes | "Regularization parameters are selected in λ1, λ2 ∈ {10^-10, 10^-9, ..., 10^-1} and the Gaussian kernel parameter σ is selected from the candidate set σ ∈ {2^-10, ..., 2^10}. We implement all algorithms based on PyTorch and use Adam as optimizer with 32 examples in a mini-batch to solve the minimization problem."
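A minimal sketch of the search grid described in the row above, enumerating the candidate sets for λ1, λ2 and σ (the variable names are assumptions; the paper pairs this grid with 5-fold cross-validation, Adam, and mini-batches of 32 in PyTorch):

```python
# Hyperparameter candidate sets from the reported setup (sketch).
lambdas = [10.0 ** p for p in range(-10, 0)]   # lambda1, lambda2 in {1e-10, ..., 1e-1}
sigmas = [2.0 ** p for p in range(-10, 11)]    # sigma in {2^-10, ..., 2^10}

# Full grid over (lambda1, lambda2, sigma) to be scored by
# cross-validation; 10 * 10 * 21 = 2100 configurations.
grid = [(l1, l2, s) for l1 in lambdas for l2 in lambdas for s in sigmas]
```

The grid size (2100 configurations) explains why the paper combines grid search with 5-fold cross-validation rather than per-run tuning.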