Fixed-Support Wasserstein Barycenters: Computational Hardness and Fast Algorithm
Authors: Tianyi Lin, Nhat Ho, Xi Chen, Marco Cuturi, Michael Jordan
NeurIPS 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Finally, we conduct extensive experiments with both synthetic data and real images and demonstrate the favorable performance of the FASTIBP algorithm in practice. In this section, we evaluate the FASTIBP algorithm for computing fixed-support Wasserstein barycenters. In all our experiments, we consider the Wasserstein distance with ℓ2-norm and compare our algorithm with Gurobi, iterative Bregman projection (IBP) algorithm [4] and Bregman ADMM [57]. |
| Researcher Affiliation | Collaboration | Tianyi Lin (University of California, Berkeley); Nhat Ho (University of Texas, Austin); Xi Chen (Stern School of Business, New York University); Marco Cuturi (CREST ENSAE; Google Brain); Michael I. Jordan (University of California, Berkeley) |
| Pseudocode | Yes | Algorithm 1: FASTIBP({C^k, u^k}_{k ∈ [m]}, ε) and Algorithm 2: Finding a Wasserstein barycenter by the FASTIBP algorithm |
| Open Source Code | No | We implement ADMM [56], APDAGD [32] and accelerated IBP [29] and find that they perform worse than our algorithm. However, we believe it is largely due to our own implementation issue since these algorithms require fine hyper-parameter tuning. We are also unaware of any public codes available online. Thus, we exclude them for a fair comparison. |
| Open Datasets | Yes | MNIST images. We apply the FASTIBP algorithm (η = 0.001) to compute the Wasserstein barycenter of the resulting images for each digit on the MNIST dataset and compare it to IBP (η = 0.001). |
| Dataset Splits | No | The paper evaluates performance on synthetic data and MNIST images but does not explicitly detail the training, validation, or test data splits used for reproducibility. |
| Hardware Specification | No | The paper describes experiments on synthetic and real data but does not provide any specific details about the hardware (e.g., CPU, GPU models, memory) used for computations. |
| Software Dependencies | No | The paper mentions using 'Gurobi' and implementing various algorithms but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | In all our experiments, we consider the Wasserstein distance with ℓ2-norm and compare our algorithm with Gurobi, iterative Bregman projection (IBP) algorithm [4] and Bregman ADMM [57]. In our figures, 'g' stands for Gurobi; 'b' stands for BADMM; 'i1' and 'i2' stand for IBP with η = 0.01 and η = 0.001; 'f1' and 'f2' stand for the FASTIBP algorithm with η = 0.01 and η = 0.001. Given n = 100, we evaluate the performance of FASTIBP, IBP, BADMM algorithms, and Gurobi by varying m ∈ {20, 50, 100, 200}. We fix Tol_fibp = 10⁻⁶ but without setting the maximum iteration number. |
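
The experiment setup compares FASTIBP against the IBP baseline of Benamou et al. [4] at regularization levels η = 0.01 and η = 0.001. For reference, below is a minimal NumPy sketch of that IBP baseline for fixed-support barycenters, not the paper's FASTIBP algorithm and not the authors' code (none is publicly released). The function name `ibp_barycenter` and its interface are illustrative assumptions, and for η as small as 0.001 a log-domain stabilized implementation would be needed in practice to avoid underflow in the Gibbs kernels.

```python
# Minimal sketch (our own, illustrative) of the IBP baseline [4] for
# fixed-support Wasserstein barycenters with entropic regularization eta.
import numpy as np

def ibp_barycenter(cost_matrices, marginals, weights, eta=0.001, tol=1e-6, max_iter=10000):
    """Approximate the barycenter of m histograms on a fixed support of size n.

    cost_matrices : list of m (n, n) arrays, e.g. squared l2 distances between support points
    marginals     : (m, n) array of input histograms u^k (each row sums to 1)
    weights       : (m,) barycenter weights summing to 1
    eta           : entropic regularization (the paper uses eta in {0.01, 0.001})
    """
    m, n = marginals.shape
    # Gibbs kernels K^k = exp(-C^k / eta); small eta underflows without log-domain stabilization.
    kernels = [np.exp(-C / eta) for C in cost_matrices]
    b = [np.ones(n) for _ in range(m)]
    q = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        q_prev = q
        # Scale each transport plan so its first marginal matches u^k.
        a = [marginals[k] / (kernels[k] @ b[k]) for k in range(m)]
        # Barycenter update: weighted geometric mean of the plans' second marginals.
        log_q = sum(weights[k] * np.log(kernels[k].T @ a[k]) for k in range(m))
        q = np.exp(log_q)
        # Scale each plan so its second marginal matches the current barycenter.
        b = [q / (kernels[k].T @ a[k]) for k in range(m)]
        if np.linalg.norm(q - q_prev, 1) < tol:
            break
    return q
```

In an MNIST-style experiment such as the one quoted above, each image would be normalized into a histogram over its pixel grid and passed as a row of `marginals`, with each entry of `cost_matrices` built from squared ℓ2 distances between pixel coordinates.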