Deep Generalized Schrödinger Bridge
Authors: Guan-Horng Liu, Tianrong Chen, Oswin So, Evangelos Theodorou
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We validate our Deep GSB on two classes of MFGs, including classical crowd navigation (d=2) and high-dimensional (d=1000) opinion depolarization. For crowd navigation, we consider three MFGs appearing in prior methods [14, 15], including (i) asymmetric obstacle avoidance, (ii) entropy interaction with a V-shape bottleneck, and (iii) congestion interaction on an S-tunnel. ... We show that our proposed objective function provides necessary and sufficient conditions to the mean-field problem. Our method, named Deep Generalized Schrödinger Bridge (Deep GSB), not only outperforms prior methods in solving classical population navigation MFGs, but is also capable of solving 1000-dimensional opinion depolarization, setting a new state-of-the-art numerical solver for high-dimensional MFGs. |
| Researcher Affiliation | Academia | 1Georgia Institute of Technology, USA 2Massachusetts Institute of Technology, USA {ghliu, tianrong.chen, evangelos.theodorou}@gatech.edu oswinso@mit.edu |
| Pseudocode | Yes | Algorithm 1 Deep Generalized Schrödinger Bridge (Deep GSB) |
| Open Source Code | Yes | Our code will be made available at https://github.com/ghliu/DeepGSB. |
| Open Datasets | No | The data is generated randomly at each training iteration. For crowd navigation, we consider three MFGs appearing in prior methods [14, 15], including (i) asymmetric obstacle avoidance, (ii) entropy interaction with a V-shape bottleneck, and (iii) congestion interaction on an S-shape tunnel. ... For opinion depolarization, we set ρ0 and ρtarget to two zero-mean Gaussians with varying variances for representing the initially polarized and desired moderated opinion distributions. |
| Dataset Splits | No | No explicit train/validation/test dataset splits are provided. The paper mentions data is 'generated randomly at each training iteration'. |
| Hardware Specification | Yes | All experiments were run on a single NVIDIA A100 GPU. |
| Software Dependencies | No | The code is written in PyTorch [71]. |
| Experiment Setup | Yes | All networks adopt sinusoidal time embeddings and are trained with AdamW [56]. All SDEs in (11, 12) are solved with the Euler-Maruyama method. ... we leave the discussion of critic parametrization Deep GSB-c, along with additional experimental details, to Appendix A.5. |
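The Experiment Setup row notes that the SDEs are integrated with the Euler-Maruyama method. As context for that choice, here is a minimal, generic Euler-Maruyama sketch in NumPy; it is not the authors' implementation (which uses PyTorch), and the function name `euler_maruyama` and its signature are hypothetical.

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t0, t1, n_steps, rng=None):
    """Simulate dX_t = drift(X_t, t) dt + diffusion(t) dW_t.

    Generic sketch: `drift` maps (state, time) -> state-shaped array,
    `diffusion` maps time -> scalar (or broadcastable) noise scale.
    Returns the full discretized path of shape (n_steps + 1, *x0.shape).
    """
    rng = rng or np.random.default_rng(0)
    x = np.array(x0, dtype=float)
    dt = (t1 - t0) / n_steps
    path = [x.copy()]
    t = t0
    for _ in range(n_steps):
        # Brownian increment dW ~ Normal(0, dt) per coordinate
        dw = rng.normal(0.0, np.sqrt(dt), size=x.shape)
        x = x + drift(x, t) * dt + diffusion(t) * dw
        t += dt
        path.append(x.copy())
    return np.array(path)
```

For example, `euler_maruyama(lambda x, t: -x, lambda t: 0.5, np.zeros(2), 0.0, 1.0, 100)` simulates a 2-D Ornstein-Uhlenbeck process over 100 steps.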