Privately Learning Markov Random Fields
Authors: Huanyu Zhang, Gautam Kamath, Janardhan Kulkarni, Steven Wu
ICML 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | We consider the problem of learning Markov Random Fields (including the prototypical example, the Ising model) under the constraint of differential privacy. Our learning goals include both structure learning, where we try to estimate the underlying graph structure of the model, as well as the harder goal of parameter learning, in which we additionally estimate the parameter on each edge. We provide algorithms and lower bounds for both problems under a variety of privacy constraints, namely pure, concentrated, and approximate differential privacy. |
| Researcher Affiliation | Collaboration | ¹School of Electrical and Computer Engineering, Cornell University; ²Cheriton School of Computer Science, University of Waterloo; ³Microsoft Research Redmond; ⁴Computer Science & Engineering, University of Minnesota. |
| Pseudocode | Yes | Algorithm 1 APFW(D, L, ρ, C): Private FW Algorithm; Algorithm 2: Privately Learning Ising Models (hedged sketches of both appear below the table) |
| Open Source Code | No | The paper does not provide any statement or link indicating the release of open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and focuses on sample complexity and algorithmic design. It refers to 'samples' and 'data set D' in its algorithms and theorems, but it does not specify any particular publicly available or open datasets used for training or empirical evaluation. |
| Dataset Splits | No | The paper is theoretical and does not describe empirical experiments with dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not mention any specific hardware used for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not specify any software dependencies with version numbers (e.g., Python, PyTorch versions) needed for replication. |
| Experiment Setup | No | The paper is theoretical and focuses on algorithms and sample complexity. It does not provide details about experimental setup, such as hyperparameters or training configurations. |
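To make the pseudocode row concrete, below is a minimal sketch of a differentially private Frank-Wolfe step in the spirit of Algorithm 1 (APFW). It assumes an ℓ1-ball constraint of radius `C` and average logistic loss, and selects each vertex with Laplace report-noisy-max; the function name `private_frank_wolfe` and the `noise_scale` parameter are illustrative assumptions, not the paper's exact mechanism (the paper calibrates noise to a zCDP budget ρ and the gradient's per-sample sensitivity).

```python
import numpy as np

def private_frank_wolfe(X, y, C, T, noise_scale, rng=None):
    """Sketch of private Frank-Wolfe over the l1-ball {w : ||w||_1 <= C}.

    Minimizes the average logistic loss on labeled data (X, y) with
    y in {-1, +1}^n. Each iteration picks a vertex of the l1-ball via
    report-noisy-max with Laplace noise; `noise_scale` is a stand-in
    for a calibration to the privacy budget (the paper splits a zCDP
    budget rho across the T iterations).
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    w = np.zeros(d)
    for t in range(T):
        # Gradient of (1/n) * sum_i log(1 + exp(-y_i <w, x_i>)).
        margins = y * (X @ w)
        grad = -(X * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)
        # The vertices of the l1-ball are {+-C e_j}; minimizing <grad, v>
        # over vertices reduces to a noisy argmax over the 2d scores below
        # (the positive factor C does not change the ordering).
        scores = np.concatenate([-grad, grad])
        noisy = scores + rng.laplace(scale=noise_scale, size=2 * d)
        j = int(np.argmax(noisy))
        v = np.zeros(d)
        v[j % d] = C if j < d else -C
        # Standard Frank-Wolfe step size.
        mu = 2.0 / (t + 2.0)
        w = (1.0 - mu) * w + mu * v
    return w
```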
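Algorithm 2 builds on this primitive: structure learning for an Ising model reduces to a private ℓ1-constrained logistic regression per node. The sketch below follows that reduction, but the per-node budget split and the `threshold` value are assumptions rather than the paper's exact calibration.

```python
def private_ising_structure(Z, C, T, noise_scale, threshold, rng=None):
    """Hedged sketch of node-wise Ising structure learning.

    For each node i, privately regress Z[:, i] on the remaining columns
    using private_frank_wolfe above, then keep the edges whose recovered
    weights clear `threshold` in absolute value.
    """
    n, p = Z.shape  # Z has entries in {-1, +1}
    edges = set()
    for i in range(p):
        others = np.delete(np.arange(p), i)
        w = private_frank_wolfe(Z[:, others], Z[:, i], C, T, noise_scale, rng)
        for k, j in enumerate(others):
            if abs(w[k]) > threshold:
                edges.add((min(i, j), max(i, j)))
    return edges
```

For example, `private_ising_structure(Z, C=1.0, T=200, noise_scale=0.1, threshold=0.2)` on samples `Z` drawn from a sparse Ising model would return a set of recovered edges; the actual guarantees in the paper depend on the width and sparsity of the model and the chosen privacy notion.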