Minimax Estimation of Maximum Mean Discrepancy with Radial Kernels

Authors: Ilya O. Tolstikhin, Bharath K. Sriperumbudur, Bernhard Schölkopf

NeurIPS 2016

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Theoretical | In this paper, we present the first known lower bounds for the estimation of MMD based on finite samples. Our lower bounds hold for any radial universal kernel on ℝ^d and match the existing upper bounds up to constants that depend only on the properties of the kernel. Using these lower bounds, we establish the minimax rate optimality of the empirical estimator and its U-statistic variant, which are usually employed in applications. (A sketch of these two estimators is given after the table.) |
| Researcher Affiliation | Academia | Ilya Tolstikhin, Department of Empirical Inference, MPI for Intelligent Systems, Tübingen 72076, Germany (ilya@tuebingen.mpg.de); Bharath K. Sriperumbudur, Department of Statistics, Pennsylvania State University, University Park, PA 16802, USA (bks18@psu.edu); Bernhard Schölkopf, Department of Empirical Inference, MPI for Intelligent Systems, Tübingen 72076, Germany (bs@tuebingen.mpg.de) |
| Pseudocode | No | The paper is theoretical and focuses on mathematical proofs; no pseudocode or algorithm blocks are provided. |
| Open Source Code | No | The paper does not mention releasing source code or provide a link to a code repository. |
| Open Datasets | No | The paper is theoretical and involves no experimental data, so no public datasets are used or cited. |
| Dataset Splits | No | The paper involves no data, so there are no training, validation, or test splits. |
| Hardware Specification | No | The paper reports no computational experiments, so no hardware is specified. |
| Software Dependencies | No | The paper reports no computational experiments, so no software dependencies or version numbers are listed. |
| Experiment Setup | No | The paper describes no experimental setup, hyperparameters, or training configuration. |
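
For context, here is a minimal sketch of the two estimators whose minimax rate optimality the paper establishes: the empirical (plug-in) estimator of squared MMD and its unbiased U-statistic variant, instantiated with a Gaussian radial kernel. The bandwidth, sample sizes, and function names below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Radial (Gaussian) kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    # Clip tiny negative distances caused by floating-point round-off.
    return np.exp(-np.maximum(sq_dists, 0.0) / (2.0 * sigma**2))

def mmd2_plugin(X, Y, sigma=1.0):
    """Empirical (plug-in / V-statistic) estimator of squared MMD."""
    return (gaussian_kernel(X, X, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean()
            - 2.0 * gaussian_kernel(X, Y, sigma).mean())

def mmd2_ustat(X, Y, sigma=1.0):
    """Unbiased U-statistic estimator: drops the i == j diagonal terms."""
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return ((Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
            + (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
            - 2.0 * Kxy.mean())

# Illustrative usage: two Gaussian samples whose means differ.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 5))
Y = rng.normal(0.5, 1.0, size=(200, 5))
print(mmd2_plugin(X, Y), mmd2_ustat(X, Y))
```

The U-statistic variant removes the diagonal kernel terms and so eliminates the bias of the plug-in estimator; per the abstract quoted above, the paper shows both attain the minimax rate up to constants that depend only on the kernel.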