Mean-Field Langevin Dynamics for Signed Measures via a Bilevel Approach
Authors: Guillaume Wang, Alireza Mousavi-Hosseini, Lénaïc Chizat
NeurIPS 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | See Fig. 1 for an illustrative numerical experiment. Note that our simulations of Brownian motion are not exact. The code to reproduce this experiment can be found at https://github.com/mousavih/2024-MFLD-bilevel. NeurIPS Paper Checklist: The contributions of this work are theoretical. A numerical illustration is given in Fig. 1. |
| Researcher Affiliation | Academia | Guillaume Wang¹, Alireza Mousavi-Hosseini², Lénaïc Chizat¹; ¹École polytechnique fédérale de Lausanne, ²University of Toronto and Vector Institute |
| Pseudocode | Yes | Algorithm 1 (Annealing of the MFLD). Require: functional J : P(W) → R, initialization η_0, β_0 > 0, schedule K, (T_k)_{k=0}^K. 1: η^0_0 = η_0. 2: for k = 0, ..., K do. 3: β_k = 2^k β_0. 4: run the MFLD at temperature β_k, initialized from η^k_0, up to time T_k: ∂_t η^k_t = div(η^k_t ∇J'[η^k_t]) + (1/β_k) Δη^k_t. 5: η^{k+1}_0 = η^k_{T_k}. 6: end for. 7: return η^K_{T_K}. (A Python sketch of this annealing loop is given after the table.) |
| Open Source Code | Yes | The code to reproduce this experiment can be found at https://github.com/mousavih/2024-MFLD-bilevel. |
| Open Datasets | No | ρ is the empirical distribution of a (covariate) dataset (x_i)_{i≤n} of n = 100 training samples, sampled i.i.d. from N(0_{d−1}, I_{d−1}), with the last coordinate representing bias. The paper describes the generation of a synthetic dataset but does not provide access information (link, citation, or repository) to a publicly available version of this specific dataset. |
| Dataset Splits | No | The paper mentions "training samples" but does not specify any training/validation/test dataset splits (e.g., percentages or counts for each split). |
| Hardware Specification | No | The paper states that "any standard laptop or desktop computer can be used to reproduce it, with a runtime of a few minutes." This is a general statement and does not provide specific hardware details (e.g., CPU/GPU models, memory amounts). |
| Software Dependencies | No | The paper mentions that code is available at a GitHub link, but it does not explicitly list specific software dependencies with their version numbers within the text itself (e.g., Python version, PyTorch version, etc.). |
| Experiment Setup | Yes | We consider the problem (1.1) where W = S^d and G is defined as in Assumption 2, where d = 10, λ = 10⁻³... For the algorithms using MFLD, we used β⁻¹ = 10⁻³. We ran the Euler–Maruyama discretization of the noisy particle gradient flow SDE described in Sec. 2... using N = 1000 particles... and a step size of 10⁻² for (1a) and 10⁻³ for (1b)... the w^i_0 are drawn i.i.d. uniformly on S^d, and for the algorithms using the lifting formulation, the r^i_0 are drawn i.i.d. from N(0, 1). (See the Euler–Maruyama sketch after the table.) |
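
For orientation, the following is a minimal sketch of the Euler–Maruyama discretization of the noisy particle gradient flow quoted in the Experiment Setup row. It assumes NumPy and substitutes a hypothetical stand-in gradient (`stand_in_grad`) for the paper's actual objective from Assumption 2; the dimension, particle count, temperature, and step size are the values quoted above, and the sphere constraint is handled by a simple renormalization that may differ from the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Values quoted from the experiment setup:
# d = 10, N = 1000 particles, beta^{-1} = 1e-3, step size 1e-2.
d, N = 10, 1000
beta_inv = 1e-3
step_size = 1e-2

def stand_in_grad(w):
    """Hypothetical stand-in for the gradient of the first variation J'[eta_t]
    at each particle; the paper's actual objective (Assumption 2, lambda = 1e-3)
    should replace this. Here: gradient of the quadratic potential |w|^2 / 2."""
    return w

def project_to_sphere(w):
    """Renormalize each particle onto the unit sphere S^d (a simple retraction;
    the paper's exact handling of the sphere constraint may differ)."""
    return w / np.linalg.norm(w, axis=1, keepdims=True)

def run_mfld(w0, beta_inv, n_steps, h, grad):
    """Euler-Maruyama discretization of the noisy particle gradient flow:
    w <- w - h * grad(w) + sqrt(2 h / beta) * xi, with xi ~ N(0, I)."""
    w = w0.copy()
    for _ in range(n_steps):
        xi = rng.standard_normal(w.shape)
        w = w - h * grad(w) + np.sqrt(2.0 * h * beta_inv) * xi
        w = project_to_sphere(w)
    return w

# Particles initialized i.i.d. uniformly on S^d (the unit sphere in R^{d+1}).
w0 = project_to_sphere(rng.standard_normal((N, d + 1)))
w_final = run_mfld(w0, beta_inv, n_steps=1000, h=step_size, grad=stand_in_grad)
```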
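
The annealing loop of Algorithm 1 can then be sketched on top of `run_mfld` from the block above: at stage k the inverse temperature is doubled (β_k = 2^k β_0) and the particles are warm-started from the previous stage. The values of β_0 and the horizons (T_k) below are placeholders, not values taken from the paper.

```python
def anneal_mfld(w0, beta0, horizons, h, grad):
    """Sketch of Algorithm 1: at stage k, run the MFLD at inverse temperature
    beta_k = 2^k * beta0, warm-started from the particles of stage k-1."""
    w = w0
    for k, T_k in enumerate(horizons):
        beta_k = (2 ** k) * beta0
        w = run_mfld(w, 1.0 / beta_k, n_steps=int(T_k / h), h=h, grad=grad)
    return w

# Placeholder schedule: K = 3 stages of unit horizon each.
w_annealed = anneal_mfld(w0, beta0=1.0, horizons=[1.0, 1.0, 1.0],
                         h=step_size, grad=stand_in_grad)
```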