Generalized Precision Matrix for Scalable Estimation of Nonparametric Markov Networks

Authors: Yujia Zheng, Ignavier Ng, Yewen Fan, Kun Zhang

ICLR 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We validate the theoretical results and demonstrate the scalability empirically in various settings." and the section heading "4 Experiments"
Researcher Affiliation | Academia | "1 Carnegie Mellon University, 2 Mohamed bin Zayed University of Artificial Intelligence"
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any statement or link indicating the release of open-source code for the described methodology.
Open Datasets | Yes | "We conduct experiments on two sets of distributions: (1) Butterfly distributions (Morrison et al., 2017; Baptista et al., 2021) and (2) distributions from random graphs." (A sampling sketch follows the table.)
Dataset Splits | No | The paper mentions sample sizes (e.g., "sample size of 1000" and "10000 samples") but does not specify explicit percentages or counts for training, validation, or test splits, nor does it mention cross-validation. (An illustrative split is sketched below.)
Hardware Specification | Yes | "All experiments are on 12 CPU cores with 24 GB RAM."
Software Dependencies | No | The paper mentions using "the deep kernel exponential family (DKEF)" and the "Adam optimizer" but does not provide version numbers for these software components or any other dependencies.
Experiment Setup | No | The paper states that the objective function is optimized by "gradient descent with the Adam optimizer" but does not report learning rates, batch sizes, number of epochs, or other hyperparameter values. (A training-loop sketch follows the table.)
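
The butterfly distributions cited in the Open Datasets row come from Morrison et al. (2017) and Baptista et al. (2021). As a hedged illustration only, one construction commonly associated with that name pairs X ~ N(0, 1) with Y = X * W for an independent W ~ N(0, 1), giving coordinates that are uncorrelated yet dependent. The sketch below samples data under that assumption; it is not the paper's data-generating code, and `sample_butterfly` is a name of ours.

```python
import numpy as np

def sample_butterfly(n_pairs: int, n_samples: int, seed: int = 0) -> np.ndarray:
    """Sample butterfly-style pairs (X, Y) with X ~ N(0, 1) and Y = X * W,
    W ~ N(0, 1) independent, so X and Y are uncorrelated but dependent.
    Assumed construction -- see the lead-in above, not the paper's code."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, n_pairs))
    w = rng.standard_normal((n_samples, n_pairs))
    y = x * w
    # Interleave the pairs into an (n_samples, 2 * n_pairs) data matrix.
    data = np.empty((n_samples, 2 * n_pairs))
    data[:, 0::2] = x
    data[:, 1::2] = y
    return data

# e.g., matching the sample size of 1000 mentioned in the paper
samples = sample_butterfly(n_pairs=3, n_samples=1000)
```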
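
The Dataset Splits row flags the absence of explicit split percentages or counts. For reference, this is the kind of explicit train/validation/test split the checklist looks for; the 80/10/10 fractions are illustrative, not from the paper.

```python
import numpy as np

def train_val_test_split(data, train_frac=0.8, val_frac=0.1, seed=0):
    """Explicit random split; the 80/10/10 fractions are illustrative only."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(data))
    n_train = int(train_frac * len(data))
    n_val = int(val_frac * len(data))
    train = data[idx[:n_train]]
    val = data[idx[n_train:n_train + n_val]]
    test = data[idx[n_train + n_val:]]
    return train, val, test
```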
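
Finally, the Experiment Setup row notes that the objective is optimized with Adam but that learning rates, batch sizes, and epoch counts are not reported. The loop below is a minimal PyTorch sketch of such a setup; the hyperparameter values are placeholders of ours, and `model` and `loss_fn` (e.g., a DKEF score-matching objective) are assumed interfaces rather than the paper's implementation.

```python
import torch

# Placeholder hyperparameters -- the paper does not report these values.
LEARNING_RATE = 1e-3
BATCH_SIZE = 256
N_EPOCHS = 200

def train(model, loss_fn, data):
    """Minimal Adam training loop of the kind the paper describes
    ("gradient descent with the Adam optimizer") but does not detail."""
    optimizer = torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)
    dataset = torch.utils.data.TensorDataset(
        torch.as_tensor(data, dtype=torch.float32))
    loader = torch.utils.data.DataLoader(
        dataset, batch_size=BATCH_SIZE, shuffle=True)
    for _ in range(N_EPOCHS):
        for (batch,) in loader:
            optimizer.zero_grad()
            loss = loss_fn(model, batch)  # assumed objective, e.g. score matching
            loss.backward()
            optimizer.step()
    return model
```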