A General Theory of Equivariant CNNs on Homogeneous Spaces

Authors: Taco S. Cohen, Mario Geiger, Maurice Weiler

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | From the abstract: "We present a general theory of Group equivariant Convolutional Neural Networks (G-CNNs) on homogeneous spaces such as Euclidean space and the sphere..." The authors state that the paper "does not contain truly new mathematics (in the sense that a professional mathematician with expertise in the relevant subjects would not be surprised by our results), but instead provides a new formalism for the study of equivariant convolutional networks." A minimal sketch of the kind of group correlation this formalism generalizes appears after the table.
Researcher Affiliation | Collaboration | Taco S. Cohen (Qualcomm AI Research, Qualcomm Technologies Netherlands B.V., tacos@qti.qualcomm.com); Mario Geiger (PCSL Research Group, EPFL, mario.geiger@epfl.ch); Maurice Weiler (QUVA Lab, University of Amsterdam, m.weiler@uva.nl)
Pseudocode | No | The paper focuses on theoretical derivations and concepts and contains no structured pseudocode or algorithm blocks.
Open Source Code | No | The paper discusses implementation aspects in general terms and references other works ([7], [10]), but provides neither a link to nor an explicit statement about open-sourced code for the theory presented here.
Open Datasets | No | This is a theoretical paper that describes no experiments or datasets, so no public training datasets are reported.
Dataset Splits | No | As a theoretical paper with no experiments, it provides no dataset split information.
Hardware Specification | No | No experiments are described, so no hardware details are given.
Software Dependencies | No | No implementation is described, so no software dependencies or version numbers are given.
Experiment Setup | No | No experiments are described, so no setup details such as hyperparameters or training settings are reported.
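
For readers who want a concrete handle on what the theory generalizes, here is a minimal sketch, not taken from the paper, of the simplest discrete case its formalism covers: a lifting correlation for the group p4 (translations and 90-degree rotations of the plane), in the style of the authors' earlier G-CNN work. The function name c4_lifting_correlation and the NumPy/SciPy test harness are illustrative assumptions, not an implementation provided by the paper.

import numpy as np
from scipy.signal import correlate2d

def c4_lifting_correlation(image, kernel):
    """Correlate the input with all four 90-degree rotations of the kernel.

    This 'lifts' a signal on the plane Z^2 to a signal on (a discretization
    of) the roto-translation group p4: one spatial feature map per rotation
    in C4 = {0, 90, 180, 270} degrees.
    """
    return np.stack([
        correlate2d(image, np.rot90(kernel, r), mode="same")
        for r in range(4)
    ])

# Equivariance check: rotating the input rotates each output plane and
# cyclically shifts the rotation channel.
rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))
kernel = rng.standard_normal((3, 3))

out = c4_lifting_correlation(image, kernel)
out_rot = c4_lifting_correlation(np.rot90(image), kernel)

for r in range(4):
    assert np.allclose(out_rot[r], np.rot90(out[(r - 1) % 4]))

Rotating the input by 90 degrees rotates each output plane and cyclically permutes the four rotation channels; this is exactly the equivariance property that the paper characterizes in full generality, for arbitrary groups, homogeneous spaces, and field types.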