Nearly-tight Bounds for Deep Kernel Learning

Authors: Yifan Zhang, Min-Ling Zhang

ICML 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | "We develop an analysis method based on the composite relationship of function classes and derive capacity-based bounds with mild dependence on the depth, which generalizes learning theory bounds to deep kernels and serves as theoretical guarantees for the generalization of DKL." "In this paper, we prove novel and nearly-tight generalization bounds based on the uniform covering number and the Rademacher chaos complexity for deep (multiple) kernel machines." (A hedged illustration of this deep-kernel setup follows the table.)
Researcher Affiliation | Academia | School of Cyber Science and Engineering, Southeast University, Nanjing 210096, China; Key Laboratory of Computer Network and Information Integration (Southeast University), Ministry of Education, China; School of Computer Science and Engineering, Southeast University, Nanjing 210096, China. Correspondence to: Min-Ling Zhang <zhangml@seu.edu.cn>.
Pseudocode | No | The paper contains no sections or figures labeled as pseudocode or algorithm blocks.
Open Source Code | No | The paper makes no statement about releasing source code and provides no link to a code repository.
Open Datasets | No | The paper defines a generic dataset D for theoretical analysis but names no public dataset and gives no access information for any dataset.
Dataset Splits | No | The paper is theoretical and reports no empirical experiments, so no training/validation/test splits are described.
Hardware Specification | No | The paper reports no computational experiments, so no hardware specifications are mentioned.
Software Dependencies | No | The paper is theoretical and reports no empirical experiments, so no software dependencies with version numbers are mentioned.
Experiment Setup | No | The paper is theoretical and describes no experimental setup such as hyperparameters or training configurations.
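To make the "Research Type" evidence concrete, the sketch below illustrates the kind of object the paper analyzes: a deep kernel built by composing a base kernel with a positive-semidefiniteness-preserving link function, plus the classical trace bound on the empirical Rademacher complexity of the unit RKHS ball. This is a minimal sketch, not the paper's construction: the link g(t) = exp(t - 1) and all function names are illustrative assumptions, and the trace bound is the standard result of Bartlett and Mendelson (2002), not the paper's tighter covering-number or Rademacher-chaos bound.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Base kernel k_0(x, y) = exp(-gamma * ||x - y||^2); note k_0(x, x) = 1.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def deep_kernel(X, Y, depth=3):
    # Hypothetical depth-l kernel k_l = g(k_{l-1}) with link g(t) = exp(t - 1).
    # Elementwise exp of a PSD matrix is PSD (power series of Schur products),
    # and g(1) = 1 keeps the diagonal normalized at every layer.
    K = rbf_kernel(X, Y)
    for _ in range(depth - 1):
        K = np.exp(K - 1.0)
    return K

def rademacher_trace_bound(K):
    # Classical bound: empirical Rademacher complexity of the unit RKHS ball
    # is at most sqrt(trace(K)) / n (Bartlett & Mendelson, 2002).
    n = K.shape[0]
    return np.sqrt(np.trace(K)) / n

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
for depth in (1, 2, 4, 8):
    K = deep_kernel(X, X, depth=depth)
    print(f"depth={depth}: Rademacher trace bound = {rademacher_trace_bound(K):.4f}")
```

Because the link fixes the kernel diagonal at 1, the trace bound comes out identical at every depth (about 1/sqrt(n) = 0.0707 here), a toy echo of the paper's theme that a well-controlled composition need not inflate capacity with depth; the paper's actual bounds are derived through finer tools (uniform covering numbers and Rademacher chaos complexity) rather than this coarse trace estimate.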