The cost of privacy: optimal rates of convergence for parameter estimation with differential privacy
DOI: 10.1214/21-AOS2058 · zbMath: 1486.62074 · arXiv: 1902.04495 · OpenAlex: W3211357969 · MaRDI QID: Q2054532
Linjun Zhang, Yichen Wang, T. Tony Cai
Publication date: 3 December 2021
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1902.04495
MSC classifications:
- Asymptotic properties of parametric estimators (62F12)
- Linear regression; mixed models (62J05)
- Parametric inference under constraints (62F30)
- Minimax procedures in statistical decision theory (62C20)
- Authentication, digital signatures and secret sharing (94A62)
Related Items
- Private Sampling: A Noiseless Approach for Generating Differentially Private Synthetic Data
- Interactive versus noninteractive locally differentially private estimation: two elbows for the quadratic functional
- On robustness and local differential privacy
- Differentially private inference via noisy optimization
Cites Work
- Iterative hard thresholding for compressed sensing
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Statistical estimation and optimal recovery
- On minimax estimation of a sparse normal mean vector
- Sparse spatial autoregressions
- Geometrizing rates of convergence under local differential privacy constraints
- What Can We Learn Privately?
- The Algorithmic Foundations of Differential Privacy
- High-Dimensional Statistics
- Minimax Optimal Procedures for Locally Private Estimation
- Finite Sample Differentially Private Confidence Intervals
- A Statistical Framework for Differential Privacy
- Fingerprinting codes and the price of approximate differential privacy
- Analyze gauss
- Privacy-preserving statistical estimation with optimal convergence rates
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Theory of Cryptography
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers