Private Graphon Estimation for Sparse Graphs

Authors: Christian Borgs, Jennifer Chayes, Adam Smith

NeurIPS 2015 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We design algorithms for fitting a high-dimensional statistical model to a large, sparse network without revealing sensitive information of individual members... We give explicit error bounds in terms of the parameters of the model; in several settings, our bounds improve on or match known nonprivate results. Our main concentration statement is captured in the following proposition.
Researcher Affiliation | Collaboration | Christian Borgs and Jennifer T. Chayes, Microsoft Research New England, Cambridge, MA, USA ({cborgs,jchayes}@microsoft.com); Adam Smith, Pennsylvania State University, University Park, PA, USA (asmith@psu.edu).
Pseudocode | Yes | Algorithm 1: Private Estimation Algorithm (see the sketch after this table). Input: ϵ > 0, λ ≥ 1, an integer k, and a graph G on n vertices. Output: a k × k block graphon (represented as a k × k matrix B̂) estimating ρW. Compute an (ϵ/2)-node-private density approximation ρ̂ = ρ(G) + Lap(4/(nϵ)); set d = λρ̂n (the target maximum degree) and μ = λρ̂ (the target L∞ norm for B̂); for each B and π, let ŝcore(B, π; ·) denote a nondecreasing Lipschitz extension (from [21]) of score(B, π; ·) from G_{n,d} to G_n such that for all matrices A, ŝcore(B, π; A) ≤ score(B, π; A), and define ŝcore(B; A) = max_π ŝcore(B, π; A); return B̂, sampled from the distribution Pr(B̂ = B) ∝ exp(ϵ · ŝcore(B; A) / (4Δ)), where Δ = 4dμ/n² = 4λ²ρ̂²/n and B ranges over matrices in B_μ = {B ∈ [0, μ]^{k×k} : all entries B_{i,j} are multiples of 1
Open Source Code | No | The paper does not provide a repository link, an explicit statement about code release, or any mention of code in supplementary materials for the methodology described. It only provides a link to the full version of the extended abstract at http://arxiv.org/abs/1506.06162.
Open Datasets | No | This is a theoretical paper that proposes and analyzes algorithms for statistical models. It does not describe experiments run on specific datasets, but rather uses abstract graph models (e.g., an input graph G, the random graph model G_n(ρW)) for theoretical analysis.
Dataset Splits | No | This is a theoretical paper that does not describe experiments involving dataset splits (training, validation, or testing).
Hardware Specification | No | This is a theoretical paper that does not describe conducting experiments, and therefore no hardware specifications are provided.
Software Dependencies | No | This is a theoretical paper that does not describe conducting experiments or implementations requiring specific software dependencies with version numbers.
Experiment Setup | No | This is a theoretical paper that designs and analyzes algorithms. It does not describe an experimental setup, hyperparameters, or training settings for practical implementation.
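The Pseudocode row above reproduces Algorithm 1. As a concrete illustration of its overall shape, here is a minimal Python sketch: a Laplace-noised edge density followed by an exponential mechanism over a small grid of k × k block matrices. It is only a sketch under simplifying assumptions, not the paper's algorithm: the nondecreasing Lipschitz extension of the score from [21] and the exact maximization over node assignments π are replaced by a hypothetical `placeholder_score` that tries a handful of random assignments, and the candidate-set discretization is coarser than the paper's, so none of the paper's node-privacy or accuracy guarantees carry over.

```python
# Illustrative sketch only (assumptions noted above); `placeholder_score`,
# `grid`, and `num_assignments` are names introduced for this sketch.
import itertools

import numpy as np


def private_density(A, eps):
    """Noisy edge density: rho_hat = rho(G) + Lap(4/(n*eps)), as in step 1."""
    n = A.shape[0]
    rho = A.sum() / (n * (n - 1))                  # edge density of the graph
    return rho + np.random.laplace(scale=4.0 / (n * eps))


def placeholder_score(B, A, num_assignments=50):
    """Stand-in for max_pi score(B, pi; A): best negative squared error of the
    block matrix B over a few random node-to-block assignments.  The paper
    instead maximizes over all assignments and uses a nondecreasing Lipschitz
    extension of that score to control its node sensitivity."""
    n, k = A.shape[0], B.shape[0]
    best = -np.inf
    for _ in range(num_assignments):
        pi = np.random.randint(0, k, size=n)       # random block assignment
        B_pi = B[np.ix_(pi, pi)]                   # n x n blow-up of B
        np.fill_diagonal(B_pi, 0.0)                # no self-loops in A
        best = max(best, -np.sum((A - B_pi) ** 2))
    return best


def private_block_estimate(A, eps, lam, k, grid=0.25):
    """Exponential-mechanism step: sample B_hat with probability proportional
    to exp(eps * score(B; A) / (4 * Delta)), Delta = 4 * lam^2 * rho_hat^2 / n,
    over a coarse grid of symmetric k x k matrices with entries in [0, mu]."""
    n = A.shape[0]
    rho_hat = private_density(A, eps)              # Lap(4/(n*eps)) noise
    rho_hat = max(rho_hat, 1.0 / n)                # keep targets positive despite noise
    mu = lam * rho_hat                             # target L-infinity norm of B_hat
    delta = 4.0 * lam ** 2 * rho_hat ** 2 / n      # sensitivity of the (extended) score

    # Candidate set: symmetric k x k matrices whose entries are multiples of
    # grid * mu in [0, mu].  (The paper's discretization differs; this one just
    # keeps the candidate set small enough to enumerate.)
    levels = np.arange(0.0, mu + 1e-12, grid * mu)
    candidates = []
    for vals in itertools.product(levels, repeat=k * (k + 1) // 2):
        B = np.zeros((k, k))
        B[np.triu_indices(k)] = vals
        B = B + B.T - np.diag(np.diag(B))          # symmetrize
        candidates.append(B)

    scores = np.array([placeholder_score(B, A) for B in candidates])
    logits = (eps / (4.0 * delta)) * scores
    probs = np.exp(logits - logits.max())          # stabilized exponential weights
    probs /= probs.sum()
    return candidates[np.random.choice(len(candidates), p=probs)]


if __name__ == "__main__":
    # Toy run: a 60-node graph from a 2-block stochastic block model.
    n, k = 60, 2
    blocks = np.random.randint(0, k, size=n)
    P = np.array([[0.6, 0.1], [0.1, 0.4]])
    upper = np.triu(np.random.rand(n, n) < P[np.ix_(blocks, blocks)], 1)
    A = (upper + upper.T).astype(float)            # symmetric adjacency, zero diagonal
    B_hat = private_block_estimate(A, eps=1.0, lam=3.0, k=k)
    print("sampled block matrix B_hat:\n", B_hat)
```

One design note on the gap this sketch leaves open: in the paper, the Lipschitz extension is what keeps the score's node sensitivity bounded by Δ on all input graphs, which is exactly what makes the exponential-mechanism step node-private; the naive score above provides no such bound.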