Concavity of reweighted Kikuchi approximation
Authors: Po-Ling Loh, Andre Wibisono
NeurIPS 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conclude with simulations that demonstrate the advantages of the reweighted Kikuchi approach. |
| Researcher Affiliation | Academia | Po-Ling Loh, Department of Statistics, The Wharton School, University of Pennsylvania, loh@wharton.upenn.edu; Andre Wibisono, Computer Science Division, University of California, Berkeley, wibisono@berkeley.edu |
| Pseudocode | No | Section 4 describes the algorithm using mathematical equations for message updates (16) and pseudomarginal computations (17), but these are not presented in a formal pseudocode block or algorithm structure. |
| Open Source Code | No | The paper does not provide any information about the availability of open-source code for the methodology described. |
| Open Datasets | No | The paper generates random potential functions for its experiments on complete graphs and toroidal grid graphs; it does not use or provide access to a public dataset. |
| Dataset Splits | No | The paper conducts simulations with specific parameters but does not describe conventional train, validation, or test dataset splits. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used to run the experiments. |
| Software Dependencies | No | The paper does not mention any specific software dependencies or their version numbers, such as programming languages, libraries, or solvers. |
| Experiment Setup | Yes | We use a damping factor of λ = 0.5, a convergence threshold of 10^-10 for the average change of messages, and at most 2500 iterations. We repeat this process with at least 8 random initializations for each value of . |
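The experiment-setup row describes a damped message-passing loop: damping factor λ = 0.5, convergence declared when the average change of messages drops below 10^-10, and at most 2500 iterations. A minimal sketch of such a loop is below; `update` stands in for the paper's message update (16) and is a hypothetical placeholder, not the authors' implementation.

```python
import numpy as np

def damped_fixed_point(update, m0, damping=0.5, tol=1e-10, max_iter=2500):
    """Generic damped fixed-point iteration with the stopping rule quoted in
    the table: stop when the average absolute change of the messages falls
    below `tol`, or after `max_iter` iterations. `update` is a placeholder
    for the actual message update; this is an illustrative sketch only."""
    m = np.asarray(m0, dtype=float)
    for it in range(1, max_iter + 1):
        # Damped update: mix the old messages with the new ones.
        m_new = (1 - damping) * m + damping * update(m)
        # Convergence test on the average change of messages.
        if np.mean(np.abs(m_new - m)) < tol:
            return m_new, it, True
        m = m_new
    return m, max_iter, False
```

As a toy usage example, iterating `update = np.cos` from `m0 = [1.0]` converges to the fixed point of cos (about 0.7390851) well within the iteration budget. In practice one would wrap this loop in the "at least 8 random initializations" restart strategy the row mentions, keeping the best converged solution.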