Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Sobolev Spaces, Kernels and Discrepancies over Hyperspheres

Authors: Simon Hubbert, Emilio Porcu, Chris J. Oates, Mark Girolami

TMLR 2023 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | This work extends analytical foundations for kernel methods beyond the usual Euclidean manifold. Specifically, we characterise the smoothness of the native spaces (reproducing kernel Hilbert spaces) that are reproduced by geodesically isotropic kernels in the hyperspherical context. Our results are relevant to several areas of machine learning; we focus on their consequences for kernel cubature, determining the rate of convergence of the worst-case error, and expanding the applicability of cubature algorithms based on Stein's method. First, we introduce a characterisation of Sobolev spaces on the d-dimensional sphere based on the Fourier-Schoenberg sequences associated with a given kernel. Such sequences are hard (if not impossible) to compute analytically on d-dimensional spheres, but often feasible over Hilbert spheres, where d = ∞. Second, we circumvent this problem by finding a projection operator that allows us to map from Hilbert spheres to finite-dimensional spheres. Our findings are illustrated for selected parametric families of kernels.
Researcher Affiliation | Academia | Simon Hubbert (EMAIL), Birkbeck, University of London; Emilio Porcu (EMAIL), Khalifa University and Trinity College Dublin; Chris J. Oates (EMAIL), Newcastle University and The Alan Turing Institute; Mark Girolami (EMAIL), University of Cambridge and The Alan Turing Institute.
Pseudocode | No | The paper is highly mathematical, presenting definitions, theorems, and proofs. It does not contain any sections labeled "Pseudocode" or "Algorithm", nor does it present structured, code-like blocks for a method or procedure.
Open Source Code | No | The paper does not contain any explicit statements or links indicating that the authors have released open-source code for the methodology described in this work. It mentions other methods and related work, but not its own code release.
Open Datasets | No | The paper is purely theoretical, focusing on mathematical characterization and proofs for kernel methods. It does not conduct empirical studies or experiments that would involve the use of datasets.
Dataset Splits | No | As the paper is theoretical and does not involve empirical experiments using datasets, there are no mentions of dataset splits for training, validation, or testing.
Hardware Specification | No | The paper is theoretical and presents mathematical derivations and proofs. It does not describe any experiments that would require specific hardware, so no hardware specifications are mentioned.
Software Dependencies | No | The paper is theoretical and focuses on mathematical foundations. It does not implement or describe any software for which dependencies or specific version numbers would be relevant.
Experiment Setup | No | The paper is theoretical, presenting mathematical characterizations and proofs. It does not describe any experimental procedures, hyperparameters, or training configurations.
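As background on the kernel-Sobolev correspondence the abstract invokes: a sketch of the standard formulation from the spherical kernel literature (stated up to normalisation conventions, not quoted from the paper itself). A geodesically isotropic kernel on the sphere S^d admits a Schoenberg expansion in Gegenbauer polynomials, and its native space matches a Sobolev space exactly when the Schoenberg coefficients decay algebraically:

```latex
% Schoenberg expansion of a geodesically isotropic kernel on S^d
% (C_n^\lambda are Gegenbauer polynomials; b_n >= 0 are the
% Fourier-Schoenberg coefficients; normalisation conventions vary)
K(x, y) \;=\; \sum_{n=0}^{\infty} b_n \, C_n^{\lambda}\big(\langle x, y \rangle\big),
\qquad \lambda = \tfrac{d-1}{2}.

% The native space (RKHS) of K coincides, with equivalent norms,
% with the Sobolev space H^s(S^d) precisely when
b_n \;\asymp\; (1 + n)^{-2s}, \qquad s > \tfrac{d}{2}.
```

Computing the sequence (b_n) in closed form for general d is what the abstract describes as hard; the paper's route via Hilbert spheres (d = ∞) and a projection operator is its proposed workaround.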