The Wasserstein Proximal Gradient Algorithm

Authors: Adil Salim, Anna Korba, Giulia Luise

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We provide numerical experiments with a ground truth target distribution µ to illustrate the dynamical behavior of the FB scheme, similarly to [34, Section 4.1]."
Researcher Affiliation | Academia | Adil Salim, Visual Computing Center, KAUST (adil.salim@kaust.edu.sa); Anna Korba, Gatsby Computational Neuroscience Unit, University College London (a.korba@ucl.ac.uk); Giulia Luise, Computer Science Department, University College London (g.luise16@ucl.ac.uk)
Pseudocode | No | The paper describes the Forward-Backward (FB) Euler scheme through equations (17) and (18) but does not provide structured pseudocode or an algorithm block. (A hedged reconstruction of the update is sketched below the table.)
Open Source Code | No | The paper does not state that source code is released and provides no link to a code repository.
Open Datasets | No | The numerical experiments use synthetically generated Gaussian distributions ("µ0 is Gaussian with m0 = 10 and σ0 = 100") rather than a named, publicly accessible dataset with explicit access information or a citation.
Dataset Splits | No | The paper reports numerical simulations illustrating the scheme's behavior but does not specify training, validation, or test splits.
Hardware Specification | No | The paper mentions "numerical experiments" but gives no details about the hardware used (e.g., GPU/CPU models, memory).
Software Dependencies | No | The paper does not list software dependencies with version numbers (e.g., Python, PyTorch, or other libraries).
Experiment Setup | Yes | "We consider F(x) = 0.5|x|^2, and H the negative entropy. [...] This allows to show the dynamical behavior of the FB scheme when γ = 0.1, and µ0 is Gaussian with m0 = 10 and σ0 = 100, in Figure 1. Note that λ = 1.0." (A closed-form sketch of this setup is given below the table.)