On the relation between differential privacy and quantitative information flow
From MaRDI portal
Publication:3012909
DOI: 10.1007/978-3-642-22012-8_4
zbMATH Open: 1333.68088
DBLP: conf/icalp/AlvimACP11
arXiv: 1109.6761
OpenAlex: W3101160956
Wikidata: Q62043003 (Scholia: Q62043003)
MaRDI QID: Q3012909 (FDO: Q3012909)
Authors: Mário S. Alvim, Miguel E. Andrés, Konstantinos Chatzikokolakis, Catuscia Palamidessi
Publication date: 7 July 2011
Published in: Automata, Languages and Programming
Abstract: Differential privacy is a notion that has emerged in the community of statistical databases as a response to the problem of protecting the privacy of a database's participants when performing statistical queries. The idea is that a randomized query satisfies differential privacy if the likelihood of obtaining a certain answer for a database is not too different from the likelihood of obtaining the same answer on adjacent databases, i.e., databases which differ from it in only one individual. Information flow is an area of security concerned with the problem of controlling the leakage of confidential information in programs and protocols. Nowadays, one of the most established approaches to quantifying and reasoning about leakage is based on the Rényi min-entropy version of information theory. In this paper, we critically analyze the notion of differential privacy in light of the conceptual framework provided by Rényi min-entropy information theory. We show that there is a close relation between differential privacy and leakage, due to the graph symmetries induced by the adjacency relation. Furthermore, we consider the utility of the randomized answer, which measures its expected degree of accuracy. We focus on certain kinds of utility functions, called "binary", which have a close correspondence with the Rényi min mutual information. Again, it turns out that there can be a tight correspondence between differential privacy and utility, depending on the symmetries induced by the adjacency relation and by the query. Depending on these symmetries, we can also build an optimal-utility randomization mechanism while preserving the required level of differential privacy. Our main contribution is a study of the kinds of structures that can be induced by the adjacency relation and the query, and of how to use them to derive bounds on the leakage and achieve optimal utility.
Full work available at URL: https://arxiv.org/abs/1109.6761
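The two quantities the abstract relates can be illustrated concretely. The sketch below, which is not taken from the paper, computes for a small channel matrix both the tightest ε for which the mechanism is ε-differentially private (here every pair of inputs is treated as adjacent) and the Rényi min-entropy leakage under a uniform prior; the randomized-response channel and the probability `p` are illustrative assumptions.

```python
import math

# Randomized response over one secret bit: report the true value with
# probability p, and the flipped value otherwise. Rows are secret inputs,
# columns are observable outputs.
p = 0.75
channel = [
    [p, 1 - p],
    [1 - p, p],
]

def min_epsilon(C):
    """Smallest eps such that C[x][y] <= e^eps * C[x'][y] for all inputs
    x, x' (all pairs treated as adjacent) and all outputs y."""
    return max(
        math.log(C[x][y] / C[x2][y])
        for y in range(len(C[0]))
        for x in range(len(C))
        for x2 in range(len(C))
        if C[x2][y] > 0
    )

def min_entropy_leakage(C):
    """Rényi min-entropy leakage under a uniform prior: log2 of the sum,
    over outputs, of the column maxima of the channel matrix."""
    return math.log2(sum(max(col) for col in zip(*C)))

eps = min_epsilon(channel)        # ln(p / (1 - p)) = ln(3) for p = 0.75
leak = min_entropy_leakage(channel)  # log2(2p) = log2(1.5) for p = 0.75
```

Raising `p` toward 1 increases both ε and the leakage together, which is the qualitative connection between differential privacy and min-entropy leakage that the paper makes precise via the symmetries of the adjacency relation.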
Cites Work
- Differential Privacy
- Quantitative notions of leakage for one-try attacks
- Asymptotic information leakage under one-try attacks
- Computing the leakage of information-hiding systems
- On the Foundations of Quantitative Information Flow
- Quantification of integrity
- Differential privacy in new settings
- Compositional Methods for Information-Hiding
- Differential privacy and robust statistics
- Compositional closure for Bayes risk in probabilistic noninterference
- Universally utility-maximizing privacy mechanisms
Cited In (8)
- Generalized differential privacy: regions of priors that admit robust optimal mechanisms
- Quantitative information flow and applications to differential privacy
- Worst- and average-case privacy breaches in randomization mechanisms
- Proving that programs are differentially private
- Thermodynamic aspects of confidentiality
- Information-theoretic foundations of differential privacy