Approximate maximum entropy principles via Goemans-Williamson with applications to provable variational methods

Authors: Andrej Risteski, Yuanzhi Li

NeurIPS 2016

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We provide computationally efficient versions of this principle when the mean parameters are pairwise moments: we design distributions that approximately match given pairwise moments, while having entropy which is comparable to the maximum entropy distribution matching those moments. We additionally provide surprising applications of the approximate maximum entropy principle to designing provable variational methods for partition function calculations for Ising models without any assumptions on the potentials of the model. More precisely, we show that we can get approximation guarantees for the log-partition function comparable to those in the low-temperature limit, which is the setting of optimization of quadratic forms over the hypercube. (A rounding sketch illustrating the Goemans-Williamson step appears after this table.)
Researcher Affiliation | Academia | Yuanzhi Li, Department of Computer Science, Princeton University, Princeton, NJ 08450, yuanzhil@cs.princeton.edu; Andrej Risteski, Department of Computer Science, Princeton University, Princeton, NJ 08450, risteski@cs.princeton.edu
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any concrete access information (e.g., a repository link or an explicit statement of code release) for source code implementing the described methodology.
Open Datasets | No | The paper is theoretical and does not use or reference any datasets for training. Thus, no information on public availability of training data is provided.
Dataset Splits | No | The paper is theoretical and does not involve experimental validation on datasets. Therefore, no information regarding validation dataset splits is provided.
Hardware Specification | No | The paper is theoretical and does not describe any specific hardware used for running experiments.
Software Dependencies | No | The paper is theoretical and does not specify any software dependencies with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe any specific experimental setup details such as hyperparameters or training configurations.
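As context for the abstract quoted in the table, the following is a minimal, illustrative sketch of the Goemans-Williamson-style hyperplane rounding that the paper's title refers to: given a PSD matrix of target pairwise moments (e.g., from a semidefinite relaxation), projecting onto random Gaussian directions and taking signs yields a distribution over the hypercube whose pairwise moments equal (2/pi)*arcsin of the targets. This is not the paper's exact construction (which additionally controls the entropy of the rounded distribution); the function name gw_round, the sample count, and the toy moment matrix below are illustrative assumptions.

```python
import numpy as np

def gw_round(sigma, num_samples=10000, seed=0):
    """Goemans-Williamson-style hyperplane rounding (illustrative sketch).

    sigma : (n, n) PSD matrix with unit diagonal, read as target pairwise
            moments E[x_i x_j] for x in {-1, +1}^n.
    Returns +/-1 samples whose empirical pairwise moments track
    (2/pi) * arcsin(sigma), the standard rounding identity.
    """
    rng = np.random.default_rng(seed)
    n = sigma.shape[0]
    # Factor sigma = V V^T via an eigendecomposition (tiny ridge for safety).
    evals, evecs = np.linalg.eigh(sigma + 1e-9 * np.eye(n))
    V = evecs * np.sqrt(np.clip(evals, 0.0, None))
    # Random hyperplanes: Gaussian directions, then signs of the projections.
    g = rng.standard_normal((num_samples, n))
    samples = np.sign(g @ V.T)          # shape (num_samples, n), entries +/-1
    samples[samples == 0] = 1.0         # break exact ties (measure zero)
    return samples

if __name__ == "__main__":
    # Hypothetical target moment matrix for n = 3 spins.
    sigma = np.array([[1.0, 0.5, 0.2],
                      [0.5, 1.0, 0.3],
                      [0.2, 0.3, 1.0]])
    x = gw_round(sigma)
    empirical = x.T @ x / x.shape[0]
    print("rounded moments (2/pi)*arcsin(sigma):\n", (2 / np.pi) * np.arcsin(sigma))
    print("empirical pairwise moments:\n", empirical)
```

The gap between the target moments sigma and the realized (2/pi)*arcsin(sigma) is precisely why the moment matching in the paper is "approximate" rather than exact.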