Geometrizing rates of convergence under local differential privacy constraints
From MaRDI portal
Abstract: We study the problem of estimating a functional of an unknown probability distribution in which the original iid sample is kept private even from the statistician via an \(\alpha\)-local differential privacy constraint. Let \(\omega_{TV}\) denote the modulus of continuity of the functional \(\theta\) over the model \(\mathcal{P}\), with respect to total variation distance. For a large class of loss functions \(\ell\) and a fixed privacy level \(\alpha\), we prove that the privatized minimax risk is equivalent to \(\ell(\omega_{TV}(n^{-1/2}))\) to within constants, under regularity conditions that are satisfied, in particular, if \(\theta\) is linear and \(\mathcal{P}\) is convex. Our results complement the theory developed by Donoho and Liu (1991) with the nowadays highly relevant case of privatized data. Somewhat surprisingly, the difficulty of the estimation problem in the private case is characterized by \(\omega_{TV}\), whereas it is characterized by the Hellinger modulus of continuity if the original data are available. We also find that, for locally private estimation of linear functionals over a convex model, a simple sample mean estimator based on independent binary privatized observations always achieves the minimax rate. We further provide a general recipe for choosing the functional parameter in the optimal binary privatization mechanisms and illustrate the general theory in numerous examples. Our theory allows us to quantify the price to be paid for local differential privacy in a large class of estimation problems. This price appears to be highly problem specific.
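The binary-privatization-plus-sample-mean idea can be illustrated on the simplest case, mean estimation of a \([0,1]\)-valued sample. The following Python sketch is our own minimal illustration under assumed notation, not the paper's optimal mechanism (the optimal choice of the binary channel's functional parameter is problem specific, as the abstract notes): each observation is first binarized unbiasedly, then passed through a randomized-response channel whose likelihood ratios are bounded by \(e^{\alpha}\), so the released bits satisfy \(\alpha\)-local differential privacy.

```python
import numpy as np

def binary_privatize(x, alpha, rng):
    """Two-stage binary alpha-LDP channel for x in [0, 1].

    Stage 1: binarize, B ~ Bernoulli(x), so E[B] = E[X].
    Stage 2: randomized response, keep B with prob p = e^a / (1 + e^a),
    flip it otherwise; likelihood ratios are bounded by p/(1-p) = e^a.
    """
    p = np.exp(alpha) / (1.0 + np.exp(alpha))
    b = rng.random(x.shape) < x          # Bernoulli(x)
    keep = rng.random(x.shape) < p       # Bernoulli(p)
    return np.where(keep, b, ~b).astype(float)

def private_mean(z, alpha):
    """Debiased sample mean: E[Z] = p*m + (1-p)*(1-m) for m = E[X]."""
    p = np.exp(alpha) / (1.0 + np.exp(alpha))
    return (z.mean() - (1.0 - p)) / (2.0 * p - 1.0)

rng = np.random.default_rng(0)
x = rng.beta(2, 5, size=200_000)         # true mean 2/7 ~ 0.2857
z = binary_privatize(x, alpha=1.0, rng=rng)
est = private_mean(z, alpha=1.0)
```

The debiasing step inflates the variance by the factor \((2p-1)^{-2}\), which is the per-sample price of privacy in this toy example; the estimator remains root-\(n\) consistent for any fixed \(\alpha > 0\).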
Recommendations
- Privacy aware learning
- Optimal locally private estimation under \(\ell_p\) loss for \(1\le p\le 2\)
- Local differential privacy: elbow effect in optimal density estimation and adaptation over Besov ellipsoids
- Minimax Optimal Procedures for Locally Private Estimation
- Asymptotically Optimal and Private Statistical Estimation
- On density estimation at a fixed point under local differential privacy
- The cost of privacy: optimal rates of convergence for parameter estimation with differential privacy
- Differentially private inference via noisy optimization
- Interactive versus noninteractive locally differentially private estimation: two elbows for the quadratic functional
- Privacy-preserving statistical estimation with optimal convergence rates
Cites work
- scientific article; zbMATH DE number 3112287 (no title available)
- A statistical framework for differential privacy
- Advances in Cryptology – CRYPTO 2004
- Convergence of estimates under dimensionality restrictions
- Differential Privacy: A Survey of Results
- Geometrizing rates of convergence. II
- Introduction to nonparametric estimation
- Minimax Optimal Procedures for Locally Private Estimation
- On general minimax theorems
- Optimale Tests und ungünstigste Verteilungen
- Privacy-preserving statistical estimation with optimal convergence rates
- Randomized response: a survey technique for eliminating evasive answer bias
- Testing Statistical Hypotheses
- The Optimal Noise-Adding Mechanism in Differential Privacy
- Theory of Cryptography
Cited in (21)
- Optimal locally private estimation under \(\ell_p\) loss for \(1\le p\le 2\)
- On lower bounds for the bias-variance trade-off
- The right complexity measure in locally private estimation: it is not the Fisher information
- On robustness and local differential privacy
- Constrained forms of statistical minimax: computation, communication, and privacy
- Distribution-invariant differential privacy
- Gaussian differentially private robust mean estimation and inference
- Exponential Separations in Local Privacy
- Multivariate density estimation from privatised data: universal consistency and minimax rates
- Phase transitions for support recovery under local differential privacy
- Wasserstein convergence in Bayesian and frequentist deconvolution models
- On density estimation at a fixed point under local differential privacy
- Strongly universally consistent nonparametric regression and classification with privatised data
- Local differential privacy: elbow effect in optimal density estimation and adaptation over Besov ellipsoids
- Nonparametric spectral density estimation under local differential privacy
- The cost of privacy: optimal rates of convergence for parameter estimation with differential privacy
- Efficiency in local differential privacy
- Density estimation under local differential privacy and Hellinger loss
- Interactive versus noninteractive locally differentially private estimation: two elbows for the quadratic functional
- Goodness-of-fit testing for Hölder continuous densities under local differential privacy
- Privacy aware learning
This page was built for publication: Geometrizing rates of convergence under local differential privacy constraints