Neural tangent kernels, transportation mappings, and universal approximation

Authors: Ziwei Ji, Matus Telgarsky, Ruicheng Xian

ICLR 2020

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | This paper establishes rates of universal approximation for the shallow neural tangent kernel (NTK): network weights may deviate only microscopically from random initialization, so activations are mostly unchanged and the network is nearly equivalent to its linearization. Concretely, the paper makes two contributions: a generic scheme for approximating functions with the NTK by sampling from transport mappings between the initial weights and their desired values, and a construction of such transport mappings via Fourier transforms.
Researcher Affiliation | Academia | Ziwei Ji, Matus Telgarsky, Ruicheng Xian; Department of Computer Science, University of Illinois at Urbana-Champaign; {ziweiji2,mjt,rxian2}@illinois.edu
Pseudocode | No | No pseudocode or algorithm blocks appear in the paper.
Open Source Code | No | The paper gives no statement or link regarding the availability of open-source code for the described methodology.
Open Datasets | No | No datasets are used; the paper is purely theoretical, so there is no public dataset to access.
Dataset Splits | No | No dataset splits are described; the paper contains no empirical evaluation.
Hardware Specification | No | No hardware is specified; the paper describes no experiments.
Software Dependencies | No | No software dependencies or version numbers are mentioned; the paper describes no experimental implementation.
Experiment Setup | No | No experimental setup or hyperparameters are given; the paper involves no empirical evaluation.
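The near-linearization property summarized under Research Type can be illustrated numerically. The following NumPy sketch is not from the paper; the width, input dimension, ReLU architecture, and perturbation scale are illustrative assumptions. It compares a wide shallow network to its first-order Taylor expansion around random initialization after each neuron's weights move by roughly 1/sqrt(m), the "microscopic" regime in which activation patterns rarely flip:

```python
import numpy as np

rng = np.random.default_rng(0)

m = 10_000  # network width (illustrative)
d = 5       # input dimension (illustrative)

# Shallow ReLU network f(x; W) = (1/sqrt(m)) * sum_j a_j * relu(w_j . x),
# with random inner weights W0 and fixed random signs a_j as outer weights.
W0 = rng.standard_normal((m, d))
a = rng.choice([-1.0, 1.0], size=m)

def f(x, W):
    return (a * np.maximum(W @ x, 0.0)).sum() / np.sqrt(m)

def f_lin(x, W):
    # First-order Taylor expansion around W0 (the NTK linearization):
    # f(x; W) ~ f(x; W0) + <grad_W f(x; W0), W - W0>.
    grad = (a * (W0 @ x > 0))[:, None] * x[None, :] / np.sqrt(m)
    return f(x, W0) + (grad * (W - W0)).sum()

x = rng.standard_normal(d)
x /= np.linalg.norm(x)

# Microscopic perturbation: each row of W moves by O(1/sqrt(m)).
W = W0 + rng.standard_normal((m, d)) / np.sqrt(m)

# Since ReLU is piecewise linear, the linearization is exact on every
# neuron whose activation pattern does not flip; only the few flipped
# neurons contribute error, so the gap is small even though f itself is O(1).
gap = abs(f(x, W) - f_lin(x, W))
print(gap)
```

Scaling the perturbation up (e.g. to constant size per neuron) flips a constant fraction of activations, and the gap becomes comparable to f itself, which is why the paper's approximation scheme stays in the microscopic regime.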