Generalized Schrödinger Bridge Matching
Authors: Guan-Horng Liu, Yaron Lipman, Maximilian Nickel, Brian Karrer, Evangelos Theodorou, Ricky T. Q. Chen
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We empirically validate our claims on an extensive suite of experimental setups, including crowd navigation, opinion depolarization, LiDAR manifolds, and image domain transfer. |
| Researcher Affiliation | Collaboration | 1Georgia Institute of Technology 2Weizmann Institute of Science 3FAIR, Meta |
| Pseudocode | Yes | Algorithm 1 match (implicit) |
| Open Source Code | No | The paper acknowledges the use of several open-source libraries and mentions the official implementation of a baseline (DeepGSB), but does not explicitly state that the code for their proposed GSBM method is open-source or provide a link to it. |
| Open Datasets | Yes | AFHQ (Choi et al., 2020), LiDAR scans of Mt. Rainier (Legg & Anderson, 2013) |
| Dataset Splits | No | The paper describes various experimental setups and tasks, including training processes, but does not explicitly provide specific dataset split information (e.g., percentages or sample counts for training, validation, and test sets). |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models, memory amounts, or cloud computing instance specifications used for running its experiments. |
| Software Dependencies | No | The paper lists several software dependencies such as 'PyTorch (Paszke et al., 2019)' and 'JAX (Bradbury et al., 2018)', providing publication years for their respective papers, but does not specify exact version numbers for these libraries (e.g., PyTorch 1.9 or JAX 0.3). |
| Experiment Setup | Yes | Table 7: Hyperparameters of the SplineOpt (Alg. 3) for each task. By default, the generation processes are discretized into 1000 steps, except for the opinion depolarization task, where we follow the DeepGSB setup and discretize into 300 steps. All networks are trained from scratch, without utilizing any pretrained checkpoint, and optimized with AdamW (Loshchilov & Hutter, 2019). |
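To make the "discretized into 1000 steps" detail concrete: generation in diffusion/bridge models is typically simulated with a fixed-step SDE integrator. The sketch below is a generic Euler–Maruyama simulator, not the paper's actual GSBM sampler; the function name, the toy drift, and all parameter values are illustrative assumptions.

```python
import math
import random

def euler_maruyama(drift, sigma, x0, n_steps=1000, t1=1.0, seed=0):
    """Simulate dX_t = drift(X_t, t) dt + sigma dW_t over [0, t1]
    with n_steps fixed Euler-Maruyama steps (generic sketch, not GSBM)."""
    rng = random.Random(seed)
    dt = t1 / n_steps
    x, t = x0, 0.0
    for _ in range(n_steps):
        # Deterministic drift increment plus Gaussian noise scaled by sqrt(dt).
        x = x + drift(x, t) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
    return x

# Toy usage: a mean-reverting drift pulling the state toward 0.
x_final = euler_maruyama(lambda x, t: -x, sigma=0.1, x0=1.0)
```

With 1000 steps the deterministic part of this toy run contracts `x0 = 1.0` toward `exp(-1) ≈ 0.37`; the reported 300-step setting for opinion depolarization would simply use `n_steps=300` in the same loop.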