On Differentially Private Graph Sparsification and Applications

Authors: Raman Arora, Jalaj Upadhyay

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this paper, we study private sparsification of graphs. In particular, we give an algorithm that, given an input graph, returns a sparse graph which approximates the spectrum of the input graph while ensuring differential privacy. This allows one to solve many graph problems privately yet efficiently and accurately. This is exemplified with application of the proposed meta-algorithm to graph algorithms for privately answering cut-queries, as well as practical algorithms for computing MAX-CUT and SPARSEST-CUT with better accuracy than previously known. We also give an efficient private algorithm to learn Laplacian eigenmap on a graph.
Researcher Affiliation | Academia | Raman Arora, Johns Hopkins University (arora@cs.jhu.edu); Jalaj Upadhyay, Rutgers University (jalaj.kumar.upadhyay@gmail.com)
Pseudocode | Yes | Algorithm 1 PRIVATE-SPARSIFY(G, ε, (δ, β))
Open Source Code | No | The paper does not provide any information or links regarding the availability of its source code.
Open Datasets | No | The paper is theoretical and does not describe experiments that involve training on specific datasets. Therefore, it does not provide information about public datasets or their access.
Dataset Splits | No | The paper is theoretical and does not describe empirical experiments involving dataset splits for training, validation, or testing.
Hardware Specification | No | The paper is theoretical and does not describe the hardware specifications used for any computational work.
Software Dependencies | No | The paper is theoretical and does not specify any software dependencies with version numbers required to replicate algorithms or results. It mentions solving semi-definite programs but no specific solver software.
Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details, hyperparameters, or training configurations.
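Since the paper releases no code, a reader may find it useful to see what "privately answering cut-queries" means in the simplest case. The sketch below is NOT the paper's sparsification-based mechanism; it is a minimal illustrative baseline assuming edge-level neighboring graphs (changing one edge weight by at most 1 changes any cut value by at most 1), so adding Laplace(1/ε) noise answers a single cut query with ε-differential privacy. The function names and interface are hypothetical.

```python
import math
import random


def laplace_noise(scale, rng):
    # Sample from Laplace(0, scale) via the inverse-CDF method.
    u = rng.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)


def private_cut_query(edges, weights, S, epsilon, rng=None):
    """Answer one cut query w(S, V \\ S) with Laplace noise.

    Under edge-level differential privacy, a single cut query has
    sensitivity 1, so Laplace(1/epsilon) noise suffices for that one
    query; answering many queries this way pays via composition,
    which is exactly the cost the paper's sparsification approach
    is designed to avoid.
    """
    rng = rng or random.Random()
    S = set(S)
    # Exact cut value: total weight of edges crossing the (S, V \ S) boundary.
    cut = sum(w for (u, v), w in zip(edges, weights) if (u in S) != (v in S))
    return cut + laplace_noise(1.0 / epsilon, rng)
```

For example, on an unweighted triangle on vertices {0, 1, 2}, the cut around S = {0} has true value 2, and with a large ε the noisy answer concentrates near it.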