On a Combination of Alternating Minimization and Nesterov’s Momentum

Authors: Sergey Guminov, Pavel Dvurechensky, Nazarii Tupitsa, Alexander Gasnikov

ICML 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | The practical efficiency of the algorithm is demonstrated by a series of numerical experiments. In this section we apply our general accelerated AM method to a non-convex collaborative filtering problem. [...] In Figure 1 we compare the performance of AM and Algorithm 1 applied to the problem (7). [...] In Figure 2, we provide a numerical comparison of our methods with Sinkhorn's algorithm, the AAR-BCD method [...] We performed experiments using randomly chosen images from MNIST dataset. [...] In Section 6 we provide numerical experiment for least squares problem for linear regression. (A toy alternating-minimization sketch for the collaborative filtering setting is given after the table.)
Researcher Affiliation | Academia | (1) Moscow Institute of Physics and Technology, Dolgoprudny, Russia; (2) Institute for Information Transmission Problems RAS, Moscow, Russia; (3) HDI Lab @ National Research University Higher School of Economics, Russian Federation; (4) Weierstrass Institute for Applied Analysis and Stochastics, Berlin, Germany.
Pseudocode | Yes | Algorithm 1 Accelerated Alternating Minimization (AAM) [...] Algorithm 2 Primal-Dual AAM (an illustrative sketch of the accelerated AM idea follows the table).
Open Source Code | Yes | Code for all presented algorithms is available at https://github.com/nazya/AAM
Open Datasets | Yes | We generate the matrix {r_ui}_{u,i} from Last.fm dataset 360K [...] We performed experiments using randomly chosen images from MNIST dataset. [...] We also illustrate the results by solving the alternating least squares problem on the Blog Feedback Data Set (Buza, 2014) obtained from UCI Machine Learning Repository.
Dataset Splits | No | The paper describes the datasets used and how the optimization variables are partitioned into blocks for the alternating minimization method, but it does not provide explicit training/validation/test split percentages or counts that would make the data partitioning reproducible.
Hardware Specification | No | The paper does not provide any specific hardware details such as GPU/CPU models, memory, or cloud resources used for running the experiments.
Software Dependencies | No | The paper does not provide specific software dependencies with version numbers that would be needed to replicate the experiments (e.g., Python version, library versions such as PyTorch or TensorFlow).
Experiment Setup | Yes | The regularization coefficient was set to λ = 0.1 [...] Parameter of entropic regularization γ = 5e-4. [...] Parameter of entropic regularization γ = 5e-5. (A Sinkhorn sketch illustrating the role of γ follows the table.)
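The Research Type row quotes the paper's experiments on a non-convex collaborative filtering problem, for which plain alternating minimization (AM) is the natural baseline: with one factor fixed, the subproblem in the other factor has a closed-form solution. As a point of reference only, here is a minimal NumPy sketch of that two-block AM (alternating least squares) on a generic regularized factorization objective. The function name, objective, and toy data are illustrative assumptions; the paper's problem (7) may differ in details (e.g., weighting of observed entries).

```python
import numpy as np

def als_sketch(R, rank=10, lam=0.1, n_iters=30, seed=0):
    """Two-block alternating minimization (ALS) for
        min_{U,V} ||R - U V^T||_F^2 + lam * (||U||_F^2 + ||V||_F^2).
    Each half-step is a closed-form ridge solve, which is what makes
    AM natural for this class of problems.  lam = 0.1 mirrors the
    regularization coefficient reported in the Experiment Setup row.
    """
    m, n = R.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    reg = lam * np.eye(rank)
    for _ in range(n_iters):
        U = np.linalg.solve(V.T @ V + reg, V.T @ R.T).T  # exact min over U
        V = np.linalg.solve(U.T @ U + reg, U.T @ R).T    # exact min over V
    return U, V

# Toy usage on a random low-rank matrix (made-up data).
rng = np.random.default_rng(1)
R = rng.standard_normal((100, 10)) @ rng.standard_normal((10, 80))
U, V = als_sketch(R, rank=10, lam=0.1)
print("relative error:", np.linalg.norm(R - U @ V.T) / np.linalg.norm(R))
```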
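The Pseudocode row lists Algorithm 1 (AAM) and Algorithm 2 (Primal-Dual AAM); the exact algorithms, with their step-size rule and convergence guarantees, are in the paper and the linked repository. The sketch below only illustrates the combination named in the title on a toy quadratic: a Nesterov-style extrapolation step followed by exact minimization over the block with the larger partial gradient. The (k-1)/(k+2) momentum weight is a common textbook choice and the block split is arbitrary, so this is not the paper's Algorithm 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy smooth objective f(z) = 0.5 * ||A z - b||^2 with the variables
# split into two blocks z = (z[:10], z[10:]).  Problem, block split,
# and momentum weight are all illustrative assumptions.
n = 20
A = rng.standard_normal((40, n))
b = rng.standard_normal(40)
f = lambda z: 0.5 * np.linalg.norm(A @ z - b) ** 2
grad = lambda z: A.T @ (A @ z - b)
blocks = [np.arange(10), np.arange(10, n)]

def argmin_block(z, idx):
    """Exact minimization of f over the block idx, other blocks fixed."""
    z = z.copy()
    rest = np.setdiff1d(np.arange(n), idx)
    r = b - A[:, rest] @ z[rest]
    z[idx] = np.linalg.lstsq(A[:, idx], r, rcond=None)[0]
    return z

x = np.zeros(n)
x_prev = x.copy()
for k in range(1, 201):
    # Nesterov-style extrapolation between successive iterates
    y = x + (k - 1.0) / (k + 2.0) * (x - x_prev)
    # greedy block choice: largest partial gradient norm at y
    g = grad(y)
    i = max(range(len(blocks)), key=lambda j: np.linalg.norm(g[blocks[j]]))
    x_prev, x = x, argmin_block(y, blocks[i])

print("objective after 200 iterations:", f(x))
```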
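The Experiment Setup row fixes the entropic regularization parameter γ at 5e-4 and 5e-5 for the optimal transport experiments, in which the proposed methods are compared against Sinkhorn's algorithm. For context on what γ controls, here is a textbook log-domain Sinkhorn iteration: this is the baseline the paper compares against, not the authors' method, and the marginals and cost matrix below are made-up toy data. The log-domain form is used because such small γ makes the plain kernel exp(-C/γ) underflow.

```python
import numpy as np
from scipy.special import logsumexp

def sinkhorn_log(C, p, q, gamma=5e-4, n_iters=500):
    """Log-domain Sinkhorn for entropy-regularized optimal transport:
        min_X <C, X> + gamma * <X, log X>   s.t.  X 1 = p,  X^T 1 = q.
    Smaller gamma gives a sharper plan but slower convergence; real
    implementations add a stopping criterion or epsilon-scaling.
    """
    fpot = np.zeros_like(p)
    gpot = np.zeros_like(q)
    for _ in range(n_iters):
        # alternate exact maximization over the two dual potentials
        fpot = gamma * (np.log(p) - logsumexp((gpot[None, :] - C) / gamma, axis=1))
        gpot = gamma * (np.log(q) - logsumexp((fpot[:, None] - C) / gamma, axis=0))
    # recover the primal transport plan from the potentials
    return np.exp((fpot[:, None] + gpot[None, :] - C) / gamma)

# Toy usage: uniform marginals and a random cost matrix.
rng = np.random.default_rng(0)
n = 32
p = np.full(n, 1 / n)
q = np.full(n, 1 / n)
C = rng.random((n, n))
X = sinkhorn_log(C, p, q, gamma=5e-4)
print("transport cost:", (C * X).sum())
```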