Clifford-Steerable Convolutional Neural Networks

Authors: Maksim Zhdanov, David Ruhe, Maurice Weiler, Ana Lucic, Johannes Brandstetter, Patrick Forré

ICML 2024

Reproducibility assessment (variable, result, and supporting excerpt from the paper):
Research Type: Experimental
Excerpt: "To assess CS-CNNs, we investigate how well they can learn to simulate dynamical systems by testing their ability to predict future states given a history of recent states (Gupta & Brandstetter, 2022). We consider three tasks: (1) fluid dynamics on R^2 (incompressible Navier-Stokes); (2) electrodynamics on R^3 (Maxwell's equations); (3) relativistic electrodynamics on R^{1,2} (Maxwell's equations)." (Section 4, Experimental Results)
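The evaluation task quoted above (predicting future states given a history of recent states) is the standard autoregressive surrogate-modeling setup. A minimal NumPy sketch of such a rollout; the history length and the model interface are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def rollout(model, history, n_steps):
    """Autoregressively predict n_steps future states.

    history: array of shape (h, *field_dims) holding the h most recent
    states; `model` maps such a stacked history to the next state.
    """
    states = list(history)
    h = len(states)
    preds = []
    for _ in range(n_steps):
        # Predict the next state from the h most recent states,
        # then feed the prediction back into the history window.
        nxt = model(np.stack(states[-h:]))
        preds.append(nxt)
        states.append(nxt)
    return np.stack(preds)  # shape (n_steps, *field_dims)
```

At test time, errors compound because each prediction is fed back as input, which is why rollout accuracy over many steps is the metric of interest in this setting.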
Researcher Affiliation: Collaboration
Excerpt: "(1) AMLab, Informatics Institute, University of Amsterdam; (2) AI4Science Lab, Informatics Institute, University of Amsterdam; (3) Anton Pannekoek Institute for Astronomy, University of Amsterdam; (4) AI4Science, Microsoft Research; (5) ELLIS Unit Linz, Institute for Machine Learning, JKU Linz, Austria; (6) NXAI GmbH."
Pseudocode: Yes
Excerpt: "Function 1 SCALARSHELL (input: η_{p,q}, v ∈ R^{p,q}, σ): s ← sgn(η_{p,q}(v, v)) exp(|η_{p,q}(v, v)| ...) ... Function 2 CLIFFORDSTEERABLEKERNEL ... Function 3 CLIFFORDSTEERABLECONVOLUTION"
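The SCALARSHELL excerpt operates on the pseudo-Euclidean metric of signature (p, q), η_{p,q}(v, v) = Σ_{i≤p} v_i² − Σ_{p<i≤p+q} v_i². A rough NumPy sketch under the assumption that the shell combines the metric's sign with an exponential envelope of its magnitude; the elided "..." and the network σ from the excerpt are not reconstructed here:

```python
import numpy as np

def eta(v, p, q):
    """Pseudo-Euclidean metric of signature (p, q), evaluated as eta(v, v):
    sum of squares of the first p components minus that of the last q."""
    v = np.asarray(v, dtype=float)
    return np.sum(v[:p] ** 2) - np.sum(v[p:p + q] ** 2)

def scalar_shell(v, p, q, sigma=1.0):
    """Illustrative scalar shell: the sign of eta(v, v) times a smooth
    envelope in its magnitude. The paper's Function 1 additionally
    involves a learnable component (sigma in the excerpt); omitted here."""
    s = eta(v, p, q)
    return np.sign(s) * np.exp(-abs(s) / (2 * sigma ** 2))
```

Note that for indefinite signatures (q > 0) the value η_{p,q}(v, v) can be negative or zero, which is why the sign factor appears: it distinguishes timelike from spacelike offsets on spacetimes such as R^{1,2}.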
Open Source Code: Yes
Excerpt: "Appendix A provides details on our implementation of CS-CNNs, available at https://github.com/maxxxzdn/clifford-group-equivariant-cnns."
Open Datasets: Yes
Excerpt: "Navier-Stokes: We use the Navier-Stokes data from Gupta & Brandstetter (2022), which is based on ΦFlow (Holl et al., 2020). ... Maxwell 3D: Simulations of the 3D Maxwell equations are taken from Brandstetter et al. (2023). ... Maxwell 2D: We simulate data for Maxwell's equations on spacetime R^{2,1} using PyCharge (Filipovich & Hughes, 2022)."
Dataset Splits: Yes
Excerpt: "For validation and testing, we randomly selected 1024 trajectories from corresponding partitions. ... The final dataset comprises 2048 training, 256 validation and 256 test simulations."
Hardware Specification: Yes
Excerpt: "Training was done on a single node with 4 NVIDIA GeForce RTX 2080 Ti GPUs."
Software Dependencies: No
The paper does not provide version numbers for its software dependencies; it only names libraries and calls such as 'torch.nn.ConvNd' or 'jax.lax.conv'.
Experiment Setup: Yes
Excerpt: "For ResNets, we follow the setup of Wang et al. (2021), Brandstetter et al. (2023), and Gupta & Brandstetter (2022): the ResNet baselines consist of 8 residual blocks, each comprising two convolution layers with 7×7 (or 7×7×7 for 3D) kernels, shortcut connections, group normalization (Wu & He, 2018), and GELU activation functions (Hendrycks & Gimpel, 2016). ... For optimization, we used the Adam optimizer (Kingma & Ba, 2015) with no learning decay and a cosine learning rate scheduler (Loshchilov & Hutter, 2017) reducing the initial value by a factor of 0.01."
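A PyTorch sketch of the baseline residual block described in the excerpt (two 7×7 convolutions, a shortcut connection, group normalization, GELU), plus the optimizer setup; channel count, number of groups, learning rate, and schedule length are illustrative assumptions, not values from the paper:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """One baseline residual block: two 7x7 convolutions with shortcut
    connection, group normalization, and GELU activations."""

    def __init__(self, channels=32, groups=8):
        super().__init__()
        # padding=3 keeps the spatial size unchanged for a 7x7 kernel
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=7, padding=3)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=7, padding=3)
        self.norm1 = nn.GroupNorm(groups, channels)
        self.norm2 = nn.GroupNorm(groups, channels)
        self.act = nn.GELU()

    def forward(self, x):
        h = self.act(self.norm1(self.conv1(x)))
        h = self.norm2(self.conv2(h))
        return self.act(h + x)  # shortcut connection

# 8 residual blocks, as in the excerpt
model = nn.Sequential(*[ResidualBlock(32) for _ in range(8)])

# Adam with a cosine schedule decaying the learning rate to 1% of its
# initial value; lr and T_max are assumed, not taken from the paper.
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
sched = torch.optim.lr_scheduler.CosineAnnealingLR(
    opt, T_max=100, eta_min=1e-3 * 0.01)
```

A 3D variant would swap `nn.Conv2d` for `nn.Conv3d` with 7×7×7 kernels, per the excerpt.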