Kernel Functional Optimisation

Authors: Arun Kumar Anjanapura Venkatesh, Alistair Shilton, Santu Rana, Sunil Gupta, Svetha Venkatesh

NeurIPS 2021

Each reproducibility variable below lists the extracted result together with the supporting LLM response quoted from the paper.
Research Type: Experimental
LLM Response: "Our experimental results on GP regression and Support Vector Machine (SVM) classification tasks involving both synthetic functions and several real-world datasets show the superiority of our approach over the state-of-the-art."
Researcher Affiliation: Academia
LLM Response: "Arun Kumar A V, Alistair Shilton, Santu Rana, Sunil Gupta, Svetha Venkatesh, Applied Artificial Intelligence Institute (A2I2), Deakin University, Waurn Ponds, Geelong, Australia. {aanjanapuravenk, alistair.shilton, santu.rana, sunil.gupta, svetha.venkatesh}@deakin.edu.au"
Pseudocode: Yes
LLM Response: "A complete algorithm for the Kernel Functional Optimisation (KFO) is given by Algorithm 1."
Open Source Code: Yes
LLM Response: "The code base used for the experiments mentioned above is available at https://github.com/mailtoarunkumarav/KernelFunctionalOptimisation"
Open Datasets: Yes
LLM Response: "In our classification and regression experiments, we use the publicly available multi-dimensional real-world datasets from the UCI repository (Dua and Graff, 2017)."
Dataset Splits: Yes
LLM Response: "We perform 10-fold cross-validation on the training data set containing 80% of the total instances and tune the cost parameter (C) of the SVM in the exponent space of [-3, 3]."
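
A minimal sketch of the split-and-tune protocol described above, assuming scikit-learn. The 80% training split, 10-fold cross-validation, and the C range of 10^-3 to 10^3 follow the quote; the stand-in dataset loader (load_breast_cancer) and the RBF kernel are illustrative assumptions, not the authors' learned kernel:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer  # stand-in for a UCI dataset
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# The training set contains 80% of the total instances, as in the paper.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.8, random_state=0
)

# Tune the SVM cost parameter C in the exponent space [-3, 3],
# i.e. C in {10^-3, 10^-2, ..., 10^3}, via 10-fold cross-validation.
param_grid = {"C": np.logspace(-3, 3, 7)}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=10)
search.fit(X_train, y_train)

print("best C:", search.best_params_["C"])
print("held-out accuracy:", search.score(X_test, y_test))
```
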
Hardware Specification: Yes
LLM Response: "The aforesaid runtimes are measured on a server with Intel Xeon processor having 16 GB of RAM."
Software Dependencies: No
LLM Response: "The paper does not provide specific ancillary software details with version numbers (e.g., library or solver names with versions such as Python 3.8 or CPLEX 12.4) needed to replicate the experiment."
Experiment Setup: Yes
LLM Response: "We have considered the following experimental settings for KFO. We have used the Matérn Harmonic Hyperkernel (Eq. (3)) to define the space of kernel functionals. To express the kernel as a kernel functional in the Hyper-RKHS, we consider N_g = 10n for a given n-dimensional problem. The outer loop, representing the number of low-dimensional subspace searches (S) to find the best kernel function, is restricted to S = 5, and the number of iterations (T) in each subspace (inner loop) is restricted to T = 20. We use the GP-UCB acquisition function to guide the search for the optimum in all our experiments and at all levels. The hyperparameters λ_h and ℓ of the hyperkernel (Eq. (3)) are tuned in the interval (0, 1] using a standard BO procedure mentioned in the supplementary material."
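
As a rough illustration of the inner-loop search described above, the sketch below runs T = 20 iterations of Bayesian optimisation with the GP-UCB acquisition α(x) = μ(x) + √β σ(x) over a random candidate set. The GP surrogate, candidate grid, toy objective, β value, and subspace dimensionality are all illustrative assumptions; this is a generic GP-UCB loop, not the authors' Algorithm 1:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objective(x):
    # Illustrative stand-in for the validation score of a candidate kernel.
    return -np.sum((x - 0.5) ** 2, axis=-1)

T = 20      # inner-loop BO iterations, as in the paper's setup
beta = 2.0  # GP-UCB exploration weight (illustrative choice)
dim = 2     # dimensionality of the (sub)space being searched (assumed)

# Start from a few random observations.
X = rng.uniform(0, 1, size=(3, dim))
y = objective(X)

candidates = rng.uniform(0, 1, size=(1000, dim))
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for t in range(T):
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    ucb = mu + np.sqrt(beta) * sigma     # GP-UCB acquisition
    x_next = candidates[np.argmax(ucb)]  # most promising candidate
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best value found:", y.max())
```

In the paper's setting this inner loop would be repeated over S = 5 low-dimensional subspaces, with the hyperkernel parameters λ_h and ℓ tuned by a further BO procedure; those outer levels are omitted here for brevity.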