Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Slicing the Gaussian Mixture Wasserstein Distance

Authors: Moritz Piening, Robert Beinert

TMLR 2025

Each entry below lists a reproducibility variable, the classified result, and the supporting LLM response.
Research Type: Experimental
LLM Response: "Furthermore, we validate the effectiveness of our approach through numerical experiments, demonstrating computational efficiency and applications in clustering, perceptual image comparison, and GMM minimization."
Researcher Affiliation: Academia
LLM Response: "Moritz Piening (EMAIL), Institut für Mathematik, Technische Universität Berlin; Robert Beinert (EMAIL), Institut für Mathematik, Technische Universität Berlin"
Pseudocode: Yes
LLM Response: "Algorithm 1: Implementation of MSW; Algorithm 2: Implementation of DSMW"
Open Source Code: Yes
LLM Response: "Our implementation¹ is based on Python 3.12 and employs the Python optimal transport library (POT) (Flamary et al., 2021) for optimal transport solvers and PyTorch (Paszke et al., 2019) for automatic differentiation. ¹https://github.com/MoePien/sliced_OT_for_GMMs"
Open Datasets: Yes
LLM Response: "For the real image set, we employ a subset of 1000 CIFAR10 images (Krizhevsky, 2009). ... we use two-dimensional Gaussian mixtures with 100 components in the form of 28 × 28 MNIST digits (LeCun et al., 1998)"
Dataset Splits: No
LLM Response: "The paper uses subsets of CIFAR10 and MNIST for several experiments (perceptual metrics, GMM quantization, barycenters) but does not specify training/validation/test splits for these datasets within its experimental methodology."
Hardware Specification: Yes
LLM Response: "Experiments were conducted on a system equipped with a 13th Gen Intel Core i5-13600K CPU and an NVIDIA GeForce RTX 3060 GPU with 12 GB of memory."
Software Dependencies: No
LLM Response: "Our implementation¹ is based on Python 3.12 and employs the Python optimal transport library (POT) (Flamary et al., 2021) for optimal transport solvers and PyTorch (Paszke et al., 2019) for automatic differentiation." While Python 3.12 is specified, no version numbers are given for POT or PyTorch.
Experiment Setup: Yes
LLM Response: "For this, we denote the set of lower triangular matrices with non-negative diagonal by Tri^0(d). Given a parameter vector ρ := (w_k, m_k, Q_k)_{k=1}^K with w_k ∈ R, m_k ∈ R^d, and Q_k ∈ Tri^0(d), we consider the parametrized GMM: ... Using a fixed number of random directions θ_ℓ in (12), we employ the Adam scheme (Kingma & Ba, 2014) in combination with automatic differentiation to minimize min_ρ \hat{DSMW}^2_L(µ_ρ, µ). ... Applying Adam with step size 0.03 for 200 iterations and 20 random initializations, and choosing L = 100 and σ = 1 (pixel), we quantize the inputs by 50-component GMMs."
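The Open Source Code entry notes that the implementation builds on POT's optimal transport solvers. For intuition about the slicing idea, here is a minimal NumPy sketch of the classic sliced Wasserstein distance between equal-sized point clouds. This is not the paper's MSW/DSMW between GMM parameters; the function name and Monte Carlo details are illustrative only.

```python
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, p=2, seed=0):
    """Monte Carlo sliced p-Wasserstein distance between two point clouds
    of equal size with uniform weights (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Draw random unit directions on the sphere.
    theta = rng.normal(size=(n_projections, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)
    # Project onto each direction; 1D optimal transport reduces to sorting.
    Xp = np.sort(X @ theta.T, axis=0)
    Yp = np.sort(Y @ theta.T, axis=0)
    # Average the 1D transport costs over directions, then take the p-th root.
    return (np.mean(np.abs(Xp - Yp) ** p)) ** (1.0 / p)
```

Sorting is what makes the sliced approach cheap: each projected 1D transport problem is solved exactly in O(n log n), versus the cubic cost of a full multidimensional OT solve.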
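Since the Software Dependencies entry flags that POT and PyTorch versions are unpinned, a small helper like the following (hypothetical, using Python's standard `importlib.metadata`) can record the exact installed versions alongside experiment logs; the distribution names to query ("POT", "torch") are assumptions about the packaging.

```python
from importlib.metadata import version, PackageNotFoundError

def pinned_versions(packages):
    """Return {distribution name: installed version string, or None if the
    distribution is not installed in the current environment}."""
    out = {}
    for name in packages:
        try:
            out[name] = version(name)
        except PackageNotFoundError:
            out[name] = None
    return out
```

For example, `pinned_versions(["POT", "torch"])` could be dumped to JSON next to the results so a rerun can match the original environment.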
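The Experiment Setup entry parametrizes each GMM by raw weights w_k, means m_k, and lower-triangular factors Q_k ∈ Tri^0(d), optimized with Adam via automatic differentiation. A NumPy sketch of one way to map unconstrained parameters to valid GMM parameters follows; the softmax/softplus re-parametrization is an assumption for illustration, since the source only states the constraint sets.

```python
import numpy as np

def gmm_from_params(w_raw, means, Q_raw):
    """Map unconstrained parameters to valid GMM parameters.

    w_raw : (K,)      raw weights -> softmax gives positive weights summing to 1
    means : (K, d)    component means (already unconstrained)
    Q_raw : (K, d, d) raw matrices -> lower-triangular factors with a
            non-negative (softplus) diagonal, so Sigma_k = Q_k Q_k^T is PSD
    """
    # Softmax with max-shift for numerical stability.
    w = np.exp(w_raw - w_raw.max())
    w /= w.sum()
    # Keep only the lower triangle, then force the diagonal non-negative.
    Q = np.tril(Q_raw)
    idx = np.arange(Q.shape[1])
    Q[:, idx, idx] = np.log1p(np.exp(Q_raw[:, idx, idx]))  # softplus
    # Covariances as Cholesky-style products (always symmetric PSD).
    Sigma = Q @ np.transpose(Q, (0, 2, 1))
    return w, means, Sigma
```

Optimizing in this unconstrained space is what lets a plain gradient scheme such as Adam run without projection steps: every iterate maps back to a valid mixture.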