Bayesian Pose Graph Optimization via Bingham Distributions and Tempered Geodesic MCMC
Authors: Tolga Birdal, Umut Simsekli, Mustafa Onur Eken, Slobodan Ilic
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We devise theoretical convergence guarantees and extensively evaluate our method on synthetic and real benchmarks. |
| Researcher Affiliation | Collaboration | Tolga Birdal1,2 Umut Simsekli3 M. Onur Eken1,2 Slobodan Ilic1,2 1 CAMP Chair, Technische Universität München, 85748, München, Germany 2 Siemens AG, 81739, München, Germany 3 LTCI, Télécom ParisTech, Université Paris-Saclay, 75013, Paris, France |
| Pseudocode | No | The paper describes the numerical integration steps and refers to the solutions in the supplementary material but does not include a structured pseudocode block or a clearly labeled algorithm figure in the main body of the paper. |
| Open Source Code | No | The paper does not contain an explicit statement about releasing the source code for their proposed methodology, nor does it provide a direct link to a code repository. |
| Open Datasets | Yes | We now evaluate our framework by running SFM on the EPFL Benchmark [60], which provides 8 to 30 images per dataset, along with ground-truth camera transformations. |
| Dataset Splits | No | The paper describes using synthetic and real benchmarks but does not specify percentages or sample counts for training, validation, or test splits. For synthetic data it mentions distorting the graph and randomly dropping (100|E|/N²)% of the edges, but no explicit splits (see the edge-dropping sketch after this table). |
| Hardware Specification | No | The paper does not provide specific details about the hardware used to run the experiments, such as GPU models, CPU types, or memory specifications. |
| Software Dependencies | No | The paper mentions software like 'VSFM [63]' and 'Ceres solver [65]' that were used, but it does not specify version numbers for these or any other software dependencies. |
| Experiment Setup | Yes | To do so, we first run our optimizer setting β to infinity for > 400 iterations. After that point, depending on the dataset, we set β to a smaller value (≈ 1000), allowing the posterior to be sampled 40 times (see the tempering sketch after this table). |
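
The two-phase schedule quoted in the Experiment Setup row (β set to infinity for more than 400 iterations, then lowered to roughly 1000 while 40 posterior samples are drawn) can be mimicked as below. This is a minimal sketch: `GeodesicMCMCSampler`, `step`, and `sample_posterior` are placeholder names chosen here for illustration, since no code was released; only the β values and iteration counts come from the quoted text.

```python
import math

class GeodesicMCMCSampler:
    """Placeholder for a tempered geodesic MCMC sampler over a pose graph (stub)."""

    def step(self, beta):
        """One tempered update at inverse temperature `beta` (stub)."""
        ...

    def sample_posterior(self):
        """Return the current pose-graph state as a posterior sample (stub)."""
        ...

def run_tempered_schedule(sampler, burn_in_iters=400, beta_sampling=1000.0, n_samples=40):
    # Phase 1: beta -> infinity, i.e. the sampler behaves like a deterministic optimizer.
    for _ in range(burn_in_iters):
        sampler.step(beta=math.inf)
    # Phase 2: lower beta (~1000 in the quoted setup) so the chain explores the posterior,
    # collecting 40 samples as described in the Experiment Setup row.
    samples = []
    for _ in range(n_samples):
        sampler.step(beta=beta_sampling)
        samples.append(sampler.sample_posterior())
    return samples
```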
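
Similarly, the synthetic distortion quoted in the Dataset Splits row, randomly dropping (100|E|/N²)% of the edges, can be sketched as follows. The graph representation (a list of (i, j) node-index pairs) and the function name `drop_edges` are assumptions for illustration, not the paper's implementation.

```python
import random

def drop_edges(edges, num_nodes, seed=0):
    # Drop (100*|E|/N^2)% of the edges, i.e. a fraction |E|/N^2 of the edge list.
    drop_fraction = len(edges) / (num_nodes ** 2)
    num_drop = int(round(drop_fraction * len(edges)))
    rng = random.Random(seed)
    shuffled = edges[:]
    rng.shuffle(shuffled)
    return shuffled[num_drop:]  # remaining edges after the random drop
```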