Minimum Weight Perfect Matching via Blossom Belief Propagation
Authors: Sung-Soo Ahn, Sejun Park, Michael Chertkov, Jinwoo Shin
NeurIPS 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we develop the first such algorithm, coined Blossom-BP, for solving the minimum weight matching problem over arbitrary graphs. Each step of the sequential algorithm requires applying BP over a modified graph constructed by contractions and expansions of blossoms, i.e., odd sets of vertices. Our scheme guarantees termination in O(n²) of BP runs, where n is the number of vertices in the original graph. In essence, the Blossom-BP offers a distributed version of the celebrated Edmonds Blossom algorithm by jumping at once over many sub-steps with a single BP. Moreover, our result provides an interpretation of the Edmonds algorithm as a sequence of LPs. |
| Researcher Affiliation | Collaboration | School of Electrical Engineering, Korea Advanced Institute of Science and Technology, Daejeon, Korea; Theoretical Division and Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, USA |
| Pseudocode | Yes | Blossom-LP algorithm: A. Solving LP on a contracted graph. ... B. Updating parameters. ... C. Termination. ... |
| Open Source Code | No | The paper does not provide a link to, or any other access to, source code for the described methodology. |
| Open Datasets | No | This is a theoretical paper and does not mention the use of any specific datasets for training or evaluation. |
| Dataset Splits | No | This is a theoretical paper and does not discuss dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not mention any specific hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not list any specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not provide details about an experimental setup, hyperparameters, or training configurations. |
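
The paper itself ships no code (see the Pseudocode and Open Source Code rows above), so the blocks below are illustrative sketches added for this report, not the authors' implementation. The "sequence of LPs" interpretation quoted in the Research Type row builds on the standard LP relaxation of minimum weight perfect matching with blossom (odd-set) constraints; Blossom-LP/Blossom-BP introduce such constraints incrementally over contracted graphs rather than enumerating all of them at once. A textbook form of that LP, for a graph G = (V, E) with edge weights w, is:

```latex
\begin{aligned}
\min_{x} \quad & \sum_{e \in E} w_e x_e \\
\text{s.t.} \quad & \sum_{e \in \delta(v)} x_e = 1 && \forall v \in V, \\
& \sum_{e \in \delta(S)} x_e \ge 1 && \forall S \subseteq V,\ |S| \text{ odd},\ |S| \ge 3, \\
& x_e \ge 0 && \forall e \in E.
\end{aligned}
```

As a correctness baseline for any reimplementation of Blossom-BP, one could compare against Edmonds' blossom algorithm as implemented in NetworkX (assuming version 2.x or later, where `max_weight_matching` returns a set of edge pairs). The helper function, graph, and weights below are illustrative assumptions, not artifacts from the paper.

```python
import networkx as nx


def min_weight_perfect_matching(edges):
    """Minimum weight perfect matching via NetworkX's Edmonds blossom implementation.

    edges: iterable of (u, v, weight) triples describing the graph.
    """
    G = nx.Graph()
    # Negate weights so that a maximum-weight, maximum-cardinality matching
    # corresponds to a minimum-weight perfect matching.
    G.add_weighted_edges_from((u, v, -w) for u, v, w in edges)
    matching = nx.max_weight_matching(G, maxcardinality=True)
    if 2 * len(matching) != G.number_of_nodes():
        raise ValueError("graph has no perfect matching")
    return matching


# Example: a 4-cycle 0-1-2-3-0 with one chord (0,2). Of the two perfect
# matchings, {(0,1),(2,3)} has total weight 3 and {(1,2),(3,0)} has weight 9,
# so the call below returns the former.
print(min_weight_perfect_matching([(0, 1, 1), (1, 2, 5), (2, 3, 2), (3, 0, 4), (0, 2, 7)]))
```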