Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Equivariant Symmetry Breaking Sets

Authors: YuQing Xie, Tess Smidt

TMLR 2024 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Finally, we provide some examples of symmetry breaking to demonstrate how our approach works in practice. The code for these examples is available at https://github.com/atomicarchitects/equivariant-SBS. ... 7 Experiments. Here, we provide some example tasks where we apply our framework of full symmetry breaking and partial symmetry breaking to an equivariant message passing GNN. We consider the cases where we can find an ideal equivariant SBS or partial SBS. This section serves primarily as a proof of concept for how our approach works in practice."
Researcher Affiliation | Academia | Yu Qing Xie (EMAIL), Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology; Tess Smidt (EMAIL), Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology
Pseudocode | Yes | The paper provides three algorithms:

Algorithm 1: Equivariant full SBS
  Input: S, symmetry of input, expressed as the pair (g, name(S)); b, canonical symmetry breaking object
  Output: B, symmetry breaking set, expressed as a pair (b', N)
  (n, name(N_O(3)(S))) ← Normalizers[name(S)]
  N ← (gn, name(N_O(3)(S)))
  return (gb, N)

Algorithm 2: Equivariant partial SBS from object
  Input: S, symmetry of input, expressed as the pair (g_S, name(S)); Cl_S(K), set of conjugate subgroups, expressed as (S, K) = ((g_S, name(S)), (g_K, name(K))); p, canonical partial symmetry breaking object
  Output: P, symmetry breaking set, expressed as a pair (p', N)
  N1 ← Normalizers[name(S)]
  (n, name(N2)) ← Normalizers[name(K)]
  N2 ← (g_S⁻¹ g_K n, name(N2))
  N ← N1 ∩ N2
  N ← (e, name(S)) N
  N ← g_S N g_S⁻¹
  return (g_S p, N)

Algorithm 3: Ideal partial SBS generating object symmetry
  Input: S, symmetry of input, expressed as the pair (g_S, name(S)); Cl_S(K), set of conjugate subgroups, expressed as (S, K) = ((g_S, name(S)), (g_K, name(K)))
  Output: H, symmetry needed for object p to generate an ideal partial SBS
  N1 ← Normalizers[name(S)]
  (n, name(N2)) ← Normalizers[name(K)]
  N2 ← (g_S⁻¹ g_K n, name(N2))
  N ← N1 ∩ N2
  N ← (e, name(S)) N2
  (Q1, ϕ) ← Quotient[N, (g_S⁻¹ g_K, K)]
  Q2 ← ϕ(N)
  C ← FindComplement[Q1, Q2]
  if C exists then
    (h, name(H)) ← ϕ⁻¹(C)
    return ((g_S h, name(H)), K)
  else
    return None
  end if
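To make the lookup-and-conjugate structure of Algorithm 1 concrete, here is a minimal, hypothetical Python sketch. It is not the paper's implementation: 2×2 rotation matrices stand in for the group action, and the NORMALIZERS table holds toy entries (the paper tabulates these for real point groups).

```python
import numpy as np

def rot(theta):
    """2x2 rotation matrix: a stand-in for a group element g."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Hypothetical lookup table: for each named symmetry group S, a representative
# element n and the name of its O(3)-normalizer. Toy values for illustration.
NORMALIZERS = {"C2": (rot(np.pi / 2), "C4")}

def equivariant_full_sbs(g, name_S, b):
    """Sketch of Algorithm 1: look up the normalizer of S, then transform
    the canonical symmetry breaking object b and the normalizer by g."""
    n, name_N = NORMALIZERS[name_S]      # (n, name(N_O(3)(S))) <- Normalizers[name(S)]
    N = (g @ n, name_N)                  # N <- (gn, name(N_O(3)(S)))
    return g @ b, N                      # return (gb, N)
```

Usage: `equivariant_full_sbs(rot(np.pi), "C2", np.array([1.0, 0.0]))` rotates both the canonical object and the normalizer representative by the same g, which is what makes the construction equivariant.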
Open Source Code | Yes | "The code for these examples is available at https://github.com/atomicarchitects/equivariant-SBS."
Open Datasets | No | The paper describes experiments on "triangular prism", "octagon to rectangle", and "BaTiO3 phase transitions". These appear to be synthetic examples or specific structural setups designed for the experiments, rather than openly available datasets with explicit download links, DOIs, or formal citations for data access. For BaTiO3, it references the Materials Project database (Jain et al., 2013) for symmetry information, but not as the source of a dataset used in the experiments. The input structures are described (e.g., "Graph with 6 nodes... position features..."), suggesting they are constructed.
Dataset Splits | No | The paper describes training specific models for the experimental examples (triangular prism, octagon to rectangle, BaTiO3 phase transitions). While it mentions training, it does not specify any formal dataset splits such as train/validation/test percentages or counts. For instance, in Section 7.1.4, it states: "Since our framework is equivariant, we fix the orientation of the prism in training. We first fix a choice of one symmetry breaking object... and train the model..." This describes the training approach but not dataset partitioning for reproducibility.
Hardware Specification | No | The paper does not provide any specific hardware details such as GPU models, CPU types, or cloud computing resources used to run the experiments.
Software Dependencies | Yes | "We apply our framework to the default equivariant message passing GNN implemented in the e3nn pytorch library Geiger & Smidt (2022)." GAP: GAP — Groups, Algorithms, and Programming, Version 4.12.2. The GAP Group, 2022. URL https://www.gap-system.org.
Experiment Setup | Yes | "We specify the irreps of the hidden features in our model up to l = 2 of both parities and use up to l = 4 spherical harmonics for point convolution filters. Further, for the radial network, we use a 3 layer fully connected network with 16 hidden features in each layer. In the final layer, we specify an odd node l = 1 (vector) feature. We take the sum of the final output vectors as the model output. ... train the model to match the vector from the center of the prism to the chosen vertex using MSE loss."