Generalized Direct Change Estimation in Ising Model Structure

Authors: Farideh Fazayeli, Arindam Banerjee

ICML 2016

Reproducibility variables, results, and supporting LLM responses:
Research Type: Experimental. Experimental results illustrating the effectiveness of the proposed estimator are presented. In this section, we evaluate the generalized direct change estimator (direct) with three different norms, and we compare our direct approach with the indirect approach. For the indirect approach, we first estimate the Ising model structures θ̂1 and θ̂2 separately with an L1-norm regularizer (Ravikumar et al., 2010). Then, we obtain δ̂θ = θ̂1 − θ̂2. In all experiments, we draw n1 and n2 i.i.d. samples from each Ising model by running Gibbs sampling. Here we set n = n1 = n2 ∈ {20, 50, 100}.
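The Gibbs sampling step quoted above can be sketched as follows. This is an illustrative implementation, not the authors' code; the function name, the use of {-1, +1} spins, and the burn-in handling are assumptions.

```python
import math
import random

def gibbs_sample_ising(theta, n_samples, burn_in=200, seed=0):
    """Draw n_samples i.i.d.-style samples from a pairwise Ising model.

    theta: symmetric p x p coupling matrix (list of lists, zero diagonal),
    with spins x_i in {-1, +1}. Sketch only; spin convention is an assumption.
    """
    rng = random.Random(seed)
    p = len(theta)
    x = [rng.choice([-1, 1]) for _ in range(p)]  # random initial state
    samples = []
    for t in range(burn_in + n_samples):
        for i in range(p):
            # Local field on node i from its neighbors
            s = sum(theta[i][j] * x[j] for j in range(p) if j != i)
            # Ising conditional: P(x_i = +1 | rest) = sigmoid(2 * s)
            prob_plus = 1.0 / (1.0 + math.exp(-2.0 * s))
            x[i] = 1 if rng.random() < prob_plus else -1
        if t >= burn_in:
            samples.append(list(x))
    return samples
```

Running this twice, once per Ising model, yields the two sample sets of sizes n1 and n2 used by both the direct and indirect estimators.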
Researcher Affiliation: Academia. Farideh Fazayeli (FARIDEH@CS.UMN.EDU), Arindam Banerjee (BANERJEE@CS.UMN.EDU), Department of Computer Science & Engineering, University of Minnesota, Twin Cities.
Pseudocode: Yes. Algorithm 1: Generalized Direct Change Estimator.
Open Source Code: No. The paper does not contain any explicit statement about releasing source code for the described methodology, nor a link to a code repository.
Open Datasets: No. In all experiments, we draw n1 and n2 i.i.d. samples from each Ising model by running Gibbs sampling. Here we set n = n1 = n2 ∈ {20, 50, 100}. This implies the data was synthetically generated; no access information for a public dataset is provided.
Dataset Splits: No. In all experiments, we draw n1 and n2 i.i.d. samples from each Ising model by running Gibbs sampling. Here we set n = n1 = n2 ∈ {20, 50, 100}. This describes sample generation for the models but does not specify a train/validation split.
Hardware Specification: No. The paper does not provide any specific details about the hardware used for the experiments.
Software Dependencies: No. The paper does not list specific software dependencies with version numbers (e.g., Python, PyTorch, or specific solver versions).
Experiment Setup: Yes. In all experiments, we draw n1 and n2 i.i.d. samples from each Ising model by running Gibbs sampling. Here we set n = n1 = n2 ∈ {20, 50, 100}. L1 norm: we first generate θ1 with three disconnected star sub-graphs (Figure 4-a) with p = 50, drawing the edge weights uniformly at random from the interval [0.3, 0.5]. We then generate θ2 by removing 10 random edges from θ1 (Figure 4-b).
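As a concrete illustration of the graph construction quoted above, θ1 and θ2 could be generated along these lines. The node partition into stars, the seeding, and the reading of the weight range as the interval [0.3, 0.5] are assumptions; the paper does not specify these details.

```python
import random

def make_star_theta(p=50, n_stars=3, low=0.3, high=0.5, seed=0):
    """Build theta1 as n_stars disconnected star sub-graphs on p nodes.

    Sketch of the setup behind Figure 4-a: nodes are split into equal
    contiguous blocks (an assumption; the paper's exact partition is not
    given), the first node of each block is the hub, and hub-leaf edge
    weights are drawn uniformly from [low, high].
    """
    rng = random.Random(seed)
    theta = [[0.0] * p for _ in range(p)]
    block = p // n_stars
    for b in range(n_stars):
        members = list(range(b * block, (b + 1) * block))
        hub, leaves = members[0], members[1:]
        for leaf in leaves:
            w = rng.uniform(low, high)
            theta[hub][leaf] = theta[leaf][hub] = w  # keep theta symmetric
    return theta

def remove_random_edges(theta, k=10, seed=1):
    """Return theta2 by deleting k random edges from theta (cf. Figure 4-b)."""
    rng = random.Random(seed)
    p = len(theta)
    edges = [(i, j) for i in range(p) for j in range(i + 1, p)
             if theta[i][j] != 0.0]
    theta2 = [row[:] for row in theta]
    for i, j in rng.sample(edges, k):
        theta2[i][j] = theta2[j][i] = 0.0
    return theta2
```

With the defaults (p = 50, three stars of 16 nodes each), θ1 has 45 edges and θ2 has 35; the difference δθ = θ1 − θ2 is then supported exactly on the 10 removed edges.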