Bayesian Poisson Tucker Decomposition for Learning the Structure of International Relations
Authors: Aaron Schein, Mingyuan Zhou, David Blei, Hanna Wallach
ICML 2016 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We show that BPTD yields an efficient MCMC inference algorithm and achieves better predictive performance than related models. We also demonstrate that it discovers interpretable latent structure that agrees with our knowledge of international relations. |
| Researcher Affiliation | Collaboration | Aaron Schein (ASCHEIN@CS.UMASS.EDU), University of Massachusetts Amherst; Mingyuan Zhou (MINGYUAN.ZHOU@MCCOMBS.UTEXAS.EDU), University of Texas at Austin; David M. Blei (DAVID.BLEI@COLUMBIA.EDU), Columbia University; Hanna Wallach (WALLACH@MICROSOFT.COM), Microsoft Research New York City |
| Pseudocode | No | The paper describes the MCMC inference algorithm and compositional allocation method verbally and with equations, but it does not provide a structured pseudocode block or algorithm listing. |
| Open Source Code | No | The paper does not include an explicit statement about releasing code or a link to a source code repository for the described methodology. |
| Open Datasets | Yes | Our data come from the Integrated Crisis Early Warning System (ICEWS) of Boschee et al. (2015) and the Global Database of Events, Language, and Tone (GDELT) of Leetaru & Schrodt (2013). |
| Dataset Splits | No | The paper describes a training tensor and a test tensor, with the test tensor further divided into observed and held-out portions. However, it does not explicitly mention a separate validation set or the split details used for model tuning (a hedged split sketch follows the table). |
| Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU/CPU models, memory) used to run the experiments. |
| Software Dependencies | No | The paper does not list specific software dependencies (e.g., library names with version numbers) required to replicate the experiments. |
| Experiment Setup | Yes | Section 6, 'Experimental setup', details hyperparameter values (ϵ0 = 0.1 and the data-driven calculation of γ0), latent dimension cardinalities (C = 20, K = 6, R = 3), and MCMC settings (5,000 training iterations, 1,000 test-inference iterations, saving every tenth sample after the first 500); a configuration sketch follows the table. |
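
The train/test arrangement noted in the 'Dataset Splits' row can be illustrated with a minimal sketch. The tensor dimensions, the time-based train/test boundary, and the 50% held-out fraction below are assumptions for illustration only; the paper reports only that the test tensor is divided into an observed portion and a held-out portion.

```python
import numpy as np

# Hypothetical dimensions: actors (countries), action types, time steps.
# These names and sizes are illustrative, not taken from the paper.
n_actors, n_actions, n_timesteps = 50, 20, 24
rng = np.random.default_rng(0)

# A dyadic event-count tensor Y[i, j, a, t]: sender i, receiver j,
# action type a, time step t.
Y = rng.poisson(1.0, size=(n_actors, n_actors, n_actions, n_timesteps))

# Split along the time mode: earlier slices form the training tensor,
# later slices form the test tensor (one plausible reading of the setup).
train_steps = 18
Y_train, Y_test = Y[..., :train_steps], Y[..., train_steps:]

# Within the test tensor, mask a random subset of entries as held out;
# the remaining entries stay observed during test-time inference.
held_out_frac = 0.5
held_out_mask = rng.random(Y_test.shape) < held_out_frac
Y_test_observed = np.where(held_out_mask, 0, Y_test)   # the model sees these
Y_test_held_out = np.where(held_out_mask, Y_test, 0)   # scored afterwards
```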
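The values in the 'Experiment Setup' row can likewise be collected into a small configuration sketch. The dataclass and its field names are ours; only the numeric values (ϵ0 = 0.1; C = 20, K = 6, R = 3; 5,000 training and 1,000 test-inference MCMC iterations; burn-in of 500 with thinning of 10) come from the row above, and γ0, which the paper computes from the data, is deliberately not fixed here.

```python
from dataclasses import dataclass


@dataclass
class BPTDConfig:
    """Configuration mirroring the setup reported in Section 6 of the paper.

    Field names are illustrative; only the default values are from the table above.
    """
    epsilon_0: float = 0.1    # shared hyperparameter ϵ0
    C: int = 20               # latent communities (country mode)
    K: int = 6                # latent topics (action-type mode)
    R: int = 3                # latent regimes (time mode)
    train_iters: int = 5_000  # MCMC iterations on the training tensor
    test_iters: int = 1_000   # MCMC iterations for test-time inference
    burn_in: int = 500        # discard the first 500 samples
    thin: int = 10            # then save every tenth sample
    # gamma_0 is computed from the data in the paper and is not reproduced here.


config = BPTDConfig()
# Indices of posterior samples retained under this schedule (test phase):
saved = list(range(config.burn_in, config.test_iters, config.thin))
```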