Differentially private inference via noisy optimization
Abstract: We propose a general optimization-based framework for computing differentially private M-estimators and a new method for constructing differentially private confidence regions. First, we show that robust statistics can be used in conjunction with noisy gradient descent or noisy Newton methods to obtain optimal private estimators with global linear or quadratic convergence, respectively. We establish local and global convergence guarantees, under both local strong convexity and self-concordance, showing that our private estimators converge with high probability to a nearly optimal neighborhood of the non-private M-estimators. Second, we tackle the problem of parametric inference by constructing differentially private estimators of the asymptotic variance of our private M-estimators. This naturally leads to approximate pivotal statistics for constructing confidence regions and conducting hypothesis tests. We demonstrate the effectiveness of a bias correction that improves small-sample empirical performance, and we illustrate the benefits of our methods in several numerical examples.
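To make the two-step recipe in the abstract concrete, here is a minimal, hypothetical Python sketch: (1) noisy gradient descent on a Huber location M-estimator, where the bounded psi-function caps each record's gradient sensitivity, and (2) a Wald-type confidence interval built from privately released plug-in estimates of the asymptotic "sandwich" variance E[psi^2]/E[psi']^2. All names (huber_psi, gauss_scale, dp_huber_location, dp_wald_interval) and all tuning choices (uniform budget splitting via basic composition, step size, clipping constant, non-private initialization) are illustrative assumptions, not the paper's algorithm or its privacy calibration.

```python
import numpy as np

def huber_psi(r, c=1.345):
    # Huber's bounded influence function: |psi| <= c everywhere,
    # which is what caps each record's contribution to the gradient.
    return np.clip(r, -c, c)

def gauss_scale(sensitivity, eps, delta):
    # Classical Gaussian-mechanism calibration:
    # sigma = sensitivity * sqrt(2 ln(1.25/delta)) / eps.
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps

def dp_huber_location(x, eps_step, delta_step, T=50, eta=1.0, c=1.345, rng=None):
    # Noisy gradient descent for the Huber location M-estimator;
    # (eps_step, delta_step) is the budget spent per iteration.
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    # Replacing one record moves the averaged psi by at most 2c/n.
    sigma = gauss_scale(2.0 * c / n, eps_step, delta_step)
    theta = 0.0  # data-independent start (a private warm start would be better)
    for _ in range(T):
        grad = np.mean(huber_psi(x - theta, c))
        theta += eta * (grad + rng.normal(0.0, sigma))
    return theta

def dp_wald_interval(x, eps, delta, T=50, c=1.345, seed=0):
    # Splits the total budget uniformly over T gradient steps plus two
    # variance releases (basic composition -- simple but loose).
    rng = np.random.default_rng(seed)
    n = len(x)
    eps_s, delta_s = eps / (T + 2), delta / (T + 2)
    theta = dp_huber_location(x, eps_s, delta_s, T=T, c=c, rng=rng)
    r = x - theta
    # Plug-in sandwich variance E[psi^2] / E[psi']^2: both summands are
    # bounded (by c^2 and by 1), so their means have replace-one
    # sensitivity c^2/n and 1/n and can be privatized directly.
    a = np.mean(huber_psi(r, c) ** 2) + rng.normal(0.0, gauss_scale(c * c / n, eps_s, delta_s))
    b = np.mean(np.abs(r) <= c) + rng.normal(0.0, gauss_scale(1.0 / n, eps_s, delta_s))
    var_hat = max(a, 1e-12) / max(b, 1e-6) ** 2 / n
    half = 1.96 * np.sqrt(var_hat)  # ~95% standard-normal quantile
    return theta, (theta - half, theta + half)

# Example: private ~95% interval for the center of a contaminated sample.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(2.0, 1.0, 4750), 2.0 + rng.standard_cauchy(250)])
print(dp_wald_interval(x, eps=1.0, delta=1e-6))
```

The robust (bounded) psi-function does double duty here: it gives the estimator its statistical robustness and simultaneously yields the finite sensitivity that the Gaussian mechanism needs, which is the interplay the abstract highlights.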
Cites work
- scientific article; zbMATH DE number 3954047
- scientific article; zbMATH DE number 2107836
- scientific article; zbMATH DE number 3336465
- A statistical framework for differential privacy
- Analyze Gauss: optimal bounds for privacy-preserving principal component analysis
- Composite convex optimization with global and local inexact oracles
- Convex optimization: algorithms and complexity
- Differentially Private Significance Tests for Regression Coefficients
- Differentially private empirical risk minimization
- Finite sample differentially private confidence intervals
- First-order methods of smooth convex optimization with inexact oracle
- Gaussian Differential Privacy
- Generalized self-concordant functions: a recipe for Newton-type methods
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Lectures on convex optimization
- Minimax Optimal Procedures for Locally Private Estimation
- On certain new notion of order Cauchy sequences, continuity in (l)-group
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Privacy-preserving parametric inference: a case for robust statistics
- Private stochastic convex optimization: optimal rates in linear time
- Randomized response: a survey technique for eliminating evasive answer bias
- Regularized \(M\)-estimators with nonconvexity: statistical and algorithmic theory for local optima
- Robust Estimation of a Location Parameter
- Robust Statistics
- Smooth Optimization with Approximate Gradient
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
- Sub-sampled Newton methods
- The algorithmic foundations of differential privacy
- The cost of privacy: optimal rates of convergence for parameter estimation with differential privacy
- What can we learn privately?
Cited in (5)
- Gaussian differentially private robust mean estimation and inference
- scientific article; zbMATH DE number 6860834
- Canonical noise distributions and private hypothesis tests
- Geometrizing rates of convergence under local differential privacy constraints
- Edge differentially private estimation in the \(\beta\)-model via jittering and method of moments