Lie Neurons: Adjoint-Equivariant Neural Networks for Semisimple Lie Algebras
Authors: Tzu-Yuan Lin, Minghan Zhu, Maani Ghaffari
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments are conducted on the so(3), sl(3), and sp(4) Lie algebras across tasks including fitting adjoint-equivariant and adjoint-invariant functions, learning system dynamics, point cloud registration, and homography-based shape classification (see the adjoint-action sketch after the table). |
| Researcher Affiliation | Academia | 1University of Michigan, Ann Arbor, MI, USA. |
| Pseudocode | No | The paper describes methods in text and figures (e.g., Figure 5 shows architecture diagrams) but does not contain a formal 'Pseudocode' or 'Algorithm' block. |
| Open Source Code | Yes | The software implementation is available at https://github.com/UMich-CURLY/LieNeurons. |
| Open Datasets | Yes | The network is trained and evaluated on ModelNet40, which contains 3D models of objects in 40 categories. |
| Dataset Splits | No | The paper consistently refers to 'training' and 'testing' data but does not explicitly describe a separate 'validation' split that would be needed for reproduction. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper does not explicitly list specific software dependencies with version numbers. It refers readers to the GitHub repository for implementation details. |
| Experiment Setup | Yes | The network architecture used in this experiment is shown in Figure 5 of the Appendix: two LN-LB+LN-LR layers with a feature dimension of 1024 per layer, followed by a last linear layer that projects the features back to dimension 3 (see the architecture sketch after the table). |
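
To illustrate the equivariance and invariance properties targeted by the paper's regression experiments, the following is a minimal sketch (not code from the paper or its repository) that builds the adjoint action of a rotation on so(3) and numerically checks that the Lie bracket is adjoint-equivariant while the trace form is adjoint-invariant:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# so(3): the hat map sends a 3-vector to a skew-symmetric matrix.
def hat(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

# Random rotation R = exp(hat(w)); the adjoint action is Ad_R(X) = R X R^T.
R = expm(hat(rng.normal(size=3)))
Ad = lambda X: R @ X @ R.T
bracket = lambda A, B: A @ B - B @ A

X1, X2 = hat(rng.normal(size=3)), hat(rng.normal(size=3))

# Equivariance of the Lie bracket: [Ad(X1), Ad(X2)] = Ad([X1, X2]).
print(np.allclose(bracket(Ad(X1), Ad(X2)), Ad(bracket(X1, X2))))  # True

# Invariance of the trace form: tr(Ad(X1) Ad(X2)) = tr(X1 X2).
print(np.isclose(np.trace(Ad(X1) @ Ad(X2)), np.trace(X1 @ X2)))   # True
```

An adjoint-equivariant network commutes with Ad in the same way the bracket does, and an adjoint-invariant network behaves like the trace form; the paper's equivariant/invariant function-fitting tasks regress functions with these properties.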
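
To make the Figure 5 description above concrete, here is a minimal PyTorch sketch of an adjoint-equivariant stack with the stated widths (hidden feature dimension 1024, final projection back to 3 channels). The layer names and the bracket-style nonlinearity are illustrative assumptions, not the paper's exact LN-LB/LN-LR implementation, which is available in the linked repository.

```python
import torch

# Features are stored as (batch, channels, 3): each channel is an so(3)
# element in hat coordinates, where the adjoint action is rotation of the
# 3-vector and the Lie bracket is the cross product.
class ChannelLinear(torch.nn.Module):
    """Mixes channels only; the algebra axis is untouched, so the layer
    commutes with the adjoint action and is equivariant by construction."""
    def __init__(self, c_in, c_out):
        super().__init__()
        self.w = torch.nn.Parameter(
            torch.randn(c_out, c_in, dtype=torch.float64) / c_in ** 0.5)

    def forward(self, x):                      # x: (B, c_in, 3)
        return torch.einsum('oc,bck->bok', self.w, x)

class BracketNonlinearity(torch.nn.Module):
    """Equivariant nonlinearity: bracket each channel with a learned
    mixture of channels (cross product in hat coordinates)."""
    def __init__(self, c):
        super().__init__()
        self.mix = ChannelLinear(c, c)

    def forward(self, x):                      # x: (B, c, 3)
        return torch.cross(x, self.mix(x), dim=-1)

# Two (linear + bracket nonlinearity) blocks at width 1024, then a final
# linear layer projecting the features back to dimension 3, as in the row above.
model = torch.nn.Sequential(
    ChannelLinear(3, 1024), BracketNonlinearity(1024),
    ChannelLinear(1024, 1024), BracketNonlinearity(1024),
    ChannelLinear(1024, 3),
)

# Numerical equivariance check: rotating every input vector rotates the output.
R = torch.matrix_exp(torch.tensor([[0.0, -0.3, 0.2],
                                   [0.3,  0.0, -0.1],
                                   [-0.2, 0.1, 0.0]], dtype=torch.float64))
x = torch.randn(4, 3, 3, dtype=torch.float64)
print(torch.allclose(model(x @ R.T), model(x) @ R.T))  # True
```

The key design point the sketch mirrors is that channel mixing and bracket operations both commute with the adjoint action, so stacking them preserves equivariance end to end.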