Tighter Sparse Approximation Bounds for ReLU Neural Networks

Authors: Carles Domingo-Enrich, Youssef Mroueh

ICLR 2022

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this work, we extend the framework of Ongie et al. (2019) and define similar Radon-based semi-norms (R,U-norms) such that a function admits an infinite-width neural network representation on a bounded open set U ⊆ ℝ^d when its R,U-norm is finite. Building on this, we derive sparse (finite-width) neural network approximation bounds that refine those of Breiman (1993) and Klusowski & Barron (2018). Finally, we show that infinite-width neural network representations on bounded open sets are not unique and study their structure, providing a functional view of mode connectivity. (A schematic of these objects is sketched after the table.)
Researcher Affiliation | Academia | The reviewed PDF is anonymized: "Anonymous authors. Paper under double-blind review."
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide concrete access to source code for the methodology described.
Open Datasets | No | The paper is theoretical and does not use datasets for empirical evaluation.
Dataset Splits | No | The paper is theoretical and does not describe experiments or dataset splits for validation.
Hardware Specification | No | The paper is theoretical and does not describe experiments or provide hardware details.
Software Dependencies | No | The paper is theoretical and does not list software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup.
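
For context on the Research Type row: the objects named in the abstract can be written out schematically. The display below follows the two-layer ReLU parametrization of Ongie et al. (2019), which the paper extends; the normalization of the measure, the exact definition of the R,U-norm, and the paper's sharpened rates are omitted here, so this is an illustrative sketch under those assumptions rather than the paper's own statements.

```latex
% Schematic of the abstract's objects (assumptions: standard two-layer ReLU
% setup as in Ongie et al. (2019); constants and the exact R,U-norm omitted).

% Infinite-width representation on a bounded open set U \subseteq \mathbb{R}^d:
% a signed measure \alpha over neurons (w, b), plus an affine part.
\[
  f(x) \;=\; \int_{\mathbb{S}^{d-1} \times \mathbb{R}}
      \bigl[\langle w, x \rangle - b\bigr]_+ \, d\alpha(w, b)
      \;+\; \langle v, x \rangle \;+\; c,
  \qquad x \in U .
\]

% Sparse (finite-width) approximation: replace the integral by n neurons.
\[
  f_n(x) \;=\; \sum_{i=1}^{n} a_i \bigl[\langle w_i, x \rangle - b_i\bigr]_+
      \;+\; \langle v, x \rangle \;+\; c .
\]

% A Maurey-type sampling argument yields errors decaying like n^{-1/2}, with a
% constant governed by the size of \alpha; the paper's contribution is to
% control this dependence through the R,U-norm, refining the bounds of
% Breiman (1993) and Klusowski & Barron (2018).
\[
  \inf_{a_i,\, w_i,\, b_i} \;\bigl\| f - f_n \bigr\|
      \;\lesssim\; \frac{\|f\|_{\mathcal{R},U}}{\sqrt{n}} .
\]
```

The non-uniqueness result quoted in the same row says that different measures \alpha can represent the same function f on U, which is what gives the functional view of mode connectivity mentioned in the abstract.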