Gauging Variational Inference
Authors: Sungsoo Ahn, Michael Chertkov, Jinwoo Shin
NeurIPS 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our extensive experiments indeed confirm that the proposed algorithms outperform and generalize MF and BP. |
| Researcher Affiliation | Academia | School of Electrical Engineering, Korea Advanced Institute of Science and Technology, Daejeon, Korea; Theoretical Division, T-4 & Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, NM 87545, USA; Skolkovo Institute of Science and Technology, 143026 Moscow, Russia |
| Pseudocode | Yes | Algorithm 1 Gauged mean-field; Algorithm 2 Gauged belief propagation |
| Open Source Code | No | The paper does not provide concrete access to source code, such as a repository link or an explicit statement of code release. It only mentions 'The running time of the implemented algorithms are reported in the supplementary material.', which implies code exists but not that it's open source. |
| Open Datasets | No | The paper states, 'We generate random GMs with factors dependent on the interaction strength parameters {β_a}_{a∈V} (akin inverse temperature)... See the supplementary material for additional information on how we generate the random models.' This indicates custom-generated data but does not provide concrete access information (link, DOI, or standard dataset name with citation) for a publicly available dataset. |
| Dataset Splits | No | The paper does not provide specific dataset split information (exact percentages, sample counts, or citations to predefined splits) for training, validation, or testing. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, processor types, or memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper mentions 'we use the generic optimization solver IPOPT [33]' but does not provide a specific version number for IPOPT. |
| Experiment Setup | Yes | We generate random GMs with factors dependent on the interaction strength parameters {β_a}_{a∈V} (akin inverse temperature) according to: f_a(x_a) = exp(−β_a |h_0(x_a) − h_1(x_a)|), where h_0 and h_1 count numbers of 0 and 1 contributions in x_a, respectively. ... In the first set of experiments, we consider relatively small, complete graphs with two types of factors: random generic (non-log-supermodular) factors and log-supermodular (positive/ferromagnetic) factors. ... In the second set of experiments, we consider more sparse, larger graphs of two types: 3-regular and grid graphs with size up to 200 factors/300 variables. |
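
To make the quoted experiment setup concrete, below is a minimal illustrative sketch (not the authors' code) of the random factor construction f_a(x_a) = exp(−β_a |h_0(x_a) − h_1(x_a)|), together with a brute-force log-partition computation of the kind that is only feasible at the small scales used in the first set of experiments. The toy complete-graph layout, the β sampling range, and all function names are assumptions made for illustration.

```python
# Illustrative sketch only: evaluates factors of the form
#   f_a(x_a) = exp(-beta_a * |h0(x_a) - h1(x_a)|),
# where h0/h1 count the 0s and 1s in the factor's variable assignment,
# and computes the exact log-partition function by enumeration on a toy model.
import itertools
import numpy as np

def factor_value(x_a, beta_a):
    """Evaluate f_a(x_a) for a binary assignment x_a of the factor's variables."""
    h1 = sum(x_a)                # number of 1 contributions in x_a
    h0 = len(x_a) - h1           # number of 0 contributions in x_a
    return np.exp(-beta_a * abs(h0 - h1))

def brute_force_log_partition(n_vars, scopes, betas):
    """Exact log Z by enumerating all binary assignments (small models only)."""
    z = 0.0
    for x in itertools.product([0, 1], repeat=n_vars):
        w = 1.0
        for a, scope in enumerate(scopes):          # scope: tuple of variable indices
            w *= factor_value([x[i] for i in scope], betas[a])
        z += w
    return np.log(z)

# Toy example (assumption): a complete graph on 4 binary variables with
# pairwise factors and interaction strengths drawn uniformly from [0, 1].
rng = np.random.default_rng(0)
n_vars = 4
scopes = list(itertools.combinations(range(n_vars), 2))
betas = rng.uniform(0.0, 1.0, size=len(scopes))
print(brute_force_log_partition(n_vars, scopes, betas))
```

In the paper's experiments, such exact enumeration serves only as the ground truth against which the gauged mean-field and gauged belief propagation approximations are compared; the sketch above is not an implementation of those algorithms.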