Debiased Sinkhorn barycenters

Authors: Hicham Janati, Marco Cuturi, Alexandre Gramfort

ICML 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirically, we illustrate the reduced blurring and the computational advantage on various applications."
Researcher Affiliation | Collaboration | Inria Saclay, France; CREST-ENSAE, France; Google Research, Brain team, France.
Pseudocode | Yes | Algorithm 1: Debiased Sinkhorn Barycenter. A hedged sketch of the iteration is given after the table.
Open Source Code | Yes | Python code can be found at https://github.com/hichamjanati/debiased-ot-barycenters.
Open Datasets | Yes | "We take 500 samples of the MNIST dataset (Le Cun & Cortes, 2010)."
Dataset Splits | Yes | "We select 10% of the dataset (a subset of 50 images; ergo K = 50) at random as our learning dictionary A and compute the barycentric coordinates of the remaining 90% subset denoted as D. ... We train a random forest classifier using the Scikit-learn library (Pedregosa et al., 2011) on this learned embedding and compute a 10-fold cross-validation." See the classification sketch after the table.
Hardware Specification | Yes | "All 6 barycenters were computed on a laptop with an Intel Core i5 3.1 GHz processor."
Software Dependencies | No | The paper mentions the Scikit-learn library (Pedregosa et al., 2011) and the PyTorch library (Paszke et al., 2017) but does not specify their version numbers.
Experiment Setup | Yes | "We set the cost matrix C to the squared Euclidean distance on the unit square and set ε = 0.002. We use the same termination criterion for all methods based on a maximum relative change of the barycenters set to 10^-5. ... We set the cost matrix C to the squared Euclidean distance on the unit cube and set ε = 0.01. ... We select 10% of the dataset (a subset of 50 images; ergo K = 50) at random as our learning dictionary A." See the cost matrix sketch after the table.
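As a reading aid for the Pseudocode entry, here is a minimal NumPy sketch of a debiased Sinkhorn barycenter iteration in the spirit of Algorithm 1: each input histogram gets a Sinkhorn-style scaling, the scalings are combined by a weighted geometric mean, and a debiasing vector d corrects the entropic bias. The function name, stopping rule, and dense-kernel formulation are assumptions made for illustration; the authors' repository linked above contains the reference implementation, and for image grids the kernel products are usually applied as separable Gaussian convolutions rather than a dense matrix.

```python
import numpy as np

def debiased_sinkhorn_barycenter(A, C, eps, weights=None, tol=1e-5, max_iter=5000):
    """Sketch of a debiased Sinkhorn barycenter iteration (cf. Algorithm 1).

    A : (n, K) array whose columns are input histograms (each sums to 1).
    C : (n, n) cost matrix, e.g. squared Euclidean distances between grid points.
    """
    n, K = A.shape
    weights = np.full(K, 1.0 / K) if weights is None else weights
    Kmat = np.exp(-C / eps)      # Gibbs kernel
    b = np.full(n, 1.0 / n)      # barycenter iterate
    d = np.ones(n)               # debiasing scaling
    for _ in range(max_iter):
        b_prev = b
        # Sinkhorn-style scaling for each input histogram
        phi = np.stack([Kmat.T @ (A[:, k] / (Kmat @ b)) for k in range(K)], axis=1)
        # weighted geometric mean, corrected by the debiasing vector d
        b = d * np.prod(phi ** weights, axis=1)
        # update of the debiasing vector (autocorrelation fixed point)
        d = np.sqrt(d * b / (Kmat @ d))
        # stop on the maximum relative change of the barycenter
        if np.max(np.abs(b - b_prev)) / np.max(np.abs(b_prev)) < tol:
            break
    return b / b.sum()
```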
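The Experiment Setup entry quotes a squared Euclidean cost on the unit square (2D experiments) or unit cube (3D experiments). Below is a small sketch of how such a cost matrix can be assembled for an image grid; the grid size and function name are illustrative, not taken from the paper.

```python
import numpy as np

def squared_euclidean_cost_2d(grid_size):
    """Pairwise squared Euclidean distances between pixel centers of a
    grid_size x grid_size image whose coordinates span the unit square."""
    ticks = np.linspace(0.0, 1.0, grid_size)
    xx, yy = np.meshgrid(ticks, ticks, indexing="ij")
    coords = np.stack([xx.ravel(), yy.ravel()], axis=1)   # (grid_size**2, 2)
    diff = coords[:, None, :] - coords[None, :, :]
    return (diff ** 2).sum(axis=-1)                        # (n, n), n = grid_size**2

C = squared_euclidean_cost_2d(28)   # e.g. a 28 x 28 MNIST-sized grid
eps = 0.002                         # regularization reported for the 2D experiments
```

Note that with ε this small, exp(-C/ε) underflows in a dense implementation, which is one reason practical codes apply the kernel as a Gaussian convolution or run the updates in the log domain.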
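The Dataset Splits entry describes the downstream evaluation: barycentric coordinates over a 50-atom dictionary serve as features for a random forest scored by 10-fold cross-validation. Here is a minimal scikit-learn sketch with placeholder data; the arrays, seed, and number of trees are illustrative and not the authors' values.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.RandomState(0)                 # illustrative seed
n_images, n_atoms = 450, 50                    # 90% of 500 images, K = 50 dictionary atoms
coords = rng.dirichlet(np.ones(n_atoms), size=n_images)   # placeholder barycentric coordinates
digits = rng.randint(0, 10, size=n_images)                # placeholder MNIST labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, coords, digits, cv=10)      # 10-fold cross-validation
print(f"mean accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```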