Quantitative Convergences of Lie Group Momentum Optimizers

Authors: Lingkai Kong, Molei Tao

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Systematic numerical verification via the eigen decomposition problem (Sec. 6) and application to a Vision Transformer (Sec. 7). Figure 1(a) shows a numerical estimation of 1/c under different condition numbers κ = L/µ for Heavy-Ball and NAG-SC.
Researcher Affiliation | Academia | Lingkai Kong, School of Mathematics, Georgia Institute of Technology, lkong75@gatech.edu; Molei Tao, School of Mathematics, Georgia Institute of Technology, mtao@gatech.edu
Pseudocode | Yes | Algorithm 1: Momentum optimizer on Lie groups
Open Source Code | Yes | Code can be found at https://github.com/konglk1203/Accelerated_Optimizer_On_Lie_Group
Open Datasets | Yes | Fig. 3 and Tab. 2 report the validation error when training a vision transformer [2] with 6.3M parameters from scratch on CIFAR, showing an improvement of Lie NAG-SC over the state-of-the-art algorithm Lie Heavy-Ball.
Dataset Splits | Yes | Fig. 3 and Tab. 2 report the validation error when training a vision transformer [2] with 6.3M parameters from scratch on CIFAR, showing an improvement of Lie NAG-SC over the state-of-the-art algorithm Lie Heavy-Ball.
Hardware Specification | Yes | In all experiments, we set n = 10, and the computations are done on a MacBook Pro (M1 chip, 8GB memory). ... The computations are done on a single Nvidia V100 GPU.
Software Dependencies | No | The paper does not provide specific version numbers for software dependencies such as programming languages or libraries used.
Experiment Setup | Yes | Such estimation is used to choose our parameters (γ and h) in all experiments as stated in Table 1. ... The model structures and hyperparameters are identical to those in Sec. 3.2 of [19].
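Algorithm 1 is the paper's momentum optimizer on Lie groups, and the eigen decomposition problem is its numerical test case. As a rough illustration of the general idea — momentum stored in the Lie algebra, group element updated through the exponential map — here is a minimal heavy-ball-style sketch on SO(n), applied to eigendecomposition via the classical Brockett cost. The cost function, update rule, and parameter choices below are illustrative assumptions, not the paper's exact Algorithm 1 or its tuned (γ, h).

```python
import numpy as np
from scipy.linalg import expm

def riem_grad(X, A, N):
    """Riemannian gradient of the Brockett cost f(X) = -tr(X^T A X N),
    expressed as a skew-symmetric matrix in the Lie algebra so(n)."""
    G = -2.0 * A @ X @ N           # Euclidean gradient of f
    M = X.T @ G
    return 0.5 * (M - M.T)         # project onto skew-symmetric matrices

def lie_heavy_ball(A, n_steps=3000, h=0.005, gamma=0.9, seed=0):
    """Heavy-ball-style momentum on SO(n): the momentum xi lives in the
    Lie algebra; the group element is updated by the exponential map,
    so X stays exactly on the manifold (no projection step needed)."""
    n = A.shape[0]
    N = np.diag(np.arange(n, 0, -1.0))   # distinct weights separate eigenvalues
    rng = np.random.default_rng(seed)
    X, _ = np.linalg.qr(rng.standard_normal((n, n)))
    if np.linalg.det(X) < 0:             # land in SO(n), not just O(n)
        X[:, 0] *= -1.0
    xi = np.zeros((n, n))                # momentum in so(n)
    for _ in range(n_steps):
        xi = gamma * xi - h * riem_grad(X, A, N)
        X = X @ expm(xi)                 # exponential-map retraction
    return X

A = np.diag([3.0, 2.0, 1.0])             # symmetric matrix to eigendecompose
X = lie_heavy_ball(A)
# Columns of X align with eigenvectors of A, so X^T A X becomes diagonal.
```

Because the retraction multiplies by `expm` of a skew-symmetric matrix, orthogonality is preserved to machine precision at every step — which is the structural advantage of Lie-group optimizers over projected Euclidean updates.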