Optimal approximation using complex-valued neural networks

Authors: Paul Geuchen, Felix Voigtlaender

Venue: NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We thus analyze the expressivity of CVNNs by studying their approximation properties. Our results yield the first quantitative approximation bounds for CVNNs that apply to a wide class of activation functions including the popular modReLU and complex cardioid activation functions. (A sketch of these two activation functions follows the table.)
Researcher Affiliation | Academia | Paul Geuchen, MIDS, KU Eichstätt-Ingolstadt, Auf der Schanz 49, 85049 Ingolstadt, Germany, paul.geuchen@ku.de; Felix Voigtlaender, MIDS, KU Eichstätt-Ingolstadt, Auf der Schanz 49, 85049 Ingolstadt, Germany, felix.voigtlaender@ku.de
Pseudocode | No | The paper does not contain any pseudocode or algorithm blocks; it focuses on mathematical proofs and theoretical analysis.
Open Source Code | No | The paper does not state that any code is released and provides no link to a code repository.
Open Datasets | No | This is a theoretical paper and does not use datasets for training.
Dataset Splits | No | This is a theoretical paper and does not involve dataset splits for empirical validation.
Hardware Specification | No | This is a theoretical paper and does not describe hardware used for experiments.
Software Dependencies | No | This is a theoretical paper and does not list versioned software dependencies needed for experimental reproducibility.
Experiment Setup | No | This is a theoretical paper and does not describe an experimental setup, hyperparameters, or training configurations.
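The two activation functions named in the Research Type excerpt, modReLU and the complex cardioid, appear in the paper only in mathematical form. The sketch below is an illustrative NumPy implementation under their standard definitions, modReLU(z) = ReLU(|z| + b) · z/|z| with a bias b and cardioid(z) = (1/2)(1 + cos ∠z) · z; the bias value and the handling of z = 0 are assumptions made here, not choices taken from the paper.

```python
import numpy as np

def modrelu(z, b=-0.5):
    """modReLU: rectifies the modulus of z while preserving its phase.
    b is a (typically learnable) bias; the output is 0 wherever |z| + b <= 0.
    The default b = -0.5 is an arbitrary illustrative value."""
    r = np.abs(z)
    safe_r = np.where(r == 0, 1.0, r)   # avoid division by zero
    phase = z / safe_r                   # equals z/|z| for z != 0, and 0 for z == 0
    return np.maximum(r + b, 0.0) * phase

def complex_cardioid(z):
    """Complex cardioid: scales z by (1 + cos(arg z)) / 2, so inputs with
    positive real part pass nearly unchanged and those with negative real
    part are attenuated."""
    return 0.5 * (1.0 + np.cos(np.angle(z))) * z

# Apply both activations to a few complex inputs.
z = np.array([1.0 + 1.0j, -2.0 + 0.5j, 0.1 - 0.3j])
print(modrelu(z))
print(complex_cardioid(z))
```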