On the Similarity between the Laplace and Neural Tangent Kernels

Authors: Amnon Geifman, Abhay Yadav, Yoni Kasten, Meirav Galun, David Jacobs, Basri Ronen

NeurIPS 2020

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Finally, we provide experiments on real data comparing NTK and the Laplace kernel, along with a larger class of γ-exponential kernels. We show that these perform almost identically." |
| Researcher Affiliation | Academia | ¹Department of Computer Science, Weizmann Institute of Science, Rehovot, Israel; ²Department of Computer Science, University of Maryland, College Park, MD |
| Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks in its main text. |
| Open Source Code | No | The paper does not contain any explicit statement about releasing source code or a direct link to a code repository for the described methodology. |
| Open Datasets | Yes | "We compare methods using the same set of 90 small scale UCI datasets... Million Songs [10], SUSY [40], HIGGS [40]... We applied the kernel (using the homogeneous versions of the Laplace, Gaussian and γ-exponential kernels) to the Cifar-10 dataset" (a minimal sketch of these kernels follows the table). |
| Dataset Splits | Yes | "We searched for hyperparameters based on a small validation dataset for all the methods and used the standard train/test partition provided on the UCI repository. ... The two columns show results with training on the full dataset and on the first 2000 examples." (A sketch of such a validation-based search also follows the table.) |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models or memory specifications) used to run its experiments. |
| Software Dependencies | No | The paper does not list the specific software dependencies, libraries, or version numbers used in the experiments. |
| Experiment Setup | No | The paper notes that "Experimental details are provided in the supplementary material" (Section 4) and discusses the number of hyperparameters searched, but it does not give specific hyperparameter values or detailed system-level training settings in the main text. |
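
Since the paper releases no code, the following is a minimal, illustrative sketch (not the authors' implementation) of the kernels the table above refers to: the γ-exponential family k(x, z) = exp(−(‖x − z‖/σ)^γ), which gives the Laplace kernel at γ = 1 and the Gaussian at γ = 2, and the standard closed-form NTK of a bias-free two-layer ReLU network on unit-norm inputs. The bandwidth σ = 1, the sample sizes, and the off-diagonal correlation used as a similarity score are assumptions made here for illustration, not values from the paper.

```python
import numpy as np

def gamma_exponential_kernel(X, Z, gamma=1.0, sigma=1.0):
    """k(x, z) = exp(-(||x - z|| / sigma)^gamma); gamma=1 is Laplace, gamma=2 is Gaussian."""
    dists = np.linalg.norm(X[:, None, :] - Z[None, :, :], axis=-1)
    return np.exp(-(dists / sigma) ** gamma)

def two_layer_relu_ntk(X, Z):
    """Standard closed-form NTK of a bias-free two-layer ReLU network,
    valid for unit-norm inputs (u is the cosine of the angle between inputs)."""
    u = np.clip(X @ Z.T, -1.0, 1.0)
    theta = np.arccos(u)
    kappa0 = (np.pi - theta) / np.pi                         # arc-cosine kernel, degree 0
    kappa1 = (u * (np.pi - theta) + np.sin(theta)) / np.pi   # arc-cosine kernel, degree 1
    return u * kappa0 + kappa1

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # project inputs onto the unit sphere

ntk = two_layer_relu_ntk(X, X)
laplace = gamma_exponential_kernel(X, X, gamma=1.0)

# Correlate the off-diagonal Gram entries as a crude similarity score.
iu = np.triu_indices_from(ntk, k=1)
print(np.corrcoef(ntk[iu], laplace[iu])[0, 1])
```

On sphere-projected inputs the two Gram matrices come out very highly correlated, which is the qualitative similarity the paper makes precise; the paper's own quantitative comparisons are in its experiments.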
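
The Dataset Splits row quotes a validation-based hyperparameter search. A hedged sketch of that protocol is below; kernel ridge regression, the synthetic data, the bandwidth grid, and the ridge regularizer are all stand-in assumptions here, since the paper defers its exact predictor and settings to the supplementary material.

```python
import numpy as np

def laplace_gram(X, Z, sigma):
    """Laplace kernel Gram matrix, k(x, z) = exp(-||x - z|| / sigma)."""
    d = np.linalg.norm(X[:, None, :] - Z[None, :, :], axis=-1)
    return np.exp(-d / sigma)

def krr_predict(X_tr, y_tr, X_te, sigma, lam=1e-6):
    """Fit kernel ridge regression on the training split, predict on X_te."""
    K = laplace_gram(X_tr, X_tr, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_tr)), y_tr)
    return laplace_gram(X_te, X_tr, sigma) @ alpha

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 5))
y = np.sin(X.sum(axis=1))                       # synthetic regression target
X_tr, y_tr = X[:200], y[:200]                   # training split
X_val, y_val = X[200:250], y[200:250]           # small validation split, as the paper describes
X_te, y_te = X[250:], y[250:]                   # held-out test split

# Choose the bandwidth minimizing validation error, then report test error.
val_mse = lambda s: np.mean((krr_predict(X_tr, y_tr, X_val, s) - y_val) ** 2)
best_sigma = min([0.5, 1.0, 2.0, 4.0], key=val_mse)
test_mse = np.mean((krr_predict(X_tr, y_tr, X_te, best_sigma) - y_te) ** 2)
print(best_sigma, test_mse)
```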