Differentially private inference via noisy optimization
Publication: 6183772
DOI: 10.1214/23-AOS2321
arXiv: 2103.11003
OpenAlex: W3136559452
MaRDI QID: Q6183772
FDO: Q6183772
Authors: Marco Avella-Medina, Casey Bradshaw, Po-Ling Loh
Publication date: 4 January 2024
Published in: The Annals of Statistics
Abstract: We propose a general optimization-based framework for computing differentially private M-estimators and a new method for constructing differentially private confidence regions. Firstly, we show that robust statistics can be used in conjunction with noisy gradient descent or noisy Newton methods in order to obtain optimal private estimators with global linear or quadratic convergence, respectively. We establish local and global convergence guarantees, under both local strong convexity and self-concordance, showing that our private estimators converge with high probability to a nearly optimal neighborhood of the non-private M-estimators. Secondly, we tackle the problem of parametric inference by constructing differentially private estimators of the asymptotic variance of our private M-estimators. This naturally leads to approximate pivotal statistics for constructing confidence regions and conducting hypothesis testing. We demonstrate the effectiveness of a bias correction that leads to enhanced small-sample empirical performance in simulations. We illustrate the benefits of our methods in several numerical examples.
Full work available at URL: https://arxiv.org/abs/2103.11003
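The "noisy gradient descent" approach described in the abstract pairs a robust, bounded-gradient loss with Gaussian noise injected at every iteration. Below is a minimal Python sketch under stated assumptions, not the paper's exact algorithm: the function name, the Huber-type clipping threshold `grad_bound`, the step size `eta`, and the noise scale `sigma` are illustrative placeholders, and a real implementation would calibrate `sigma` to the gradient sensitivity and the overall (epsilon, delta) privacy budget via composition over the `T` iterations.

```python
import numpy as np

def noisy_gradient_descent(X, y, theta0, T=50, eta=0.5,
                           grad_bound=1.0, sigma=0.5, seed=0):
    """Sketch of differentially private M-estimation for linear
    regression via noisy gradient descent (hypothetical parameters).

    Assumes rows of X are bounded (e.g., standardized features), so
    that clipping the residuals bounds each per-sample gradient and
    hence the sensitivity of every update.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.array(theta0, dtype=float)
    for _ in range(T):
        r = y - X @ theta
        # Huber-type psi-function: bounding the residuals is what makes
        # the robust loss compatible with privacy noise calibration.
        psi = np.clip(r, -grad_bound, grad_bound)
        grad = -(X.T @ psi) / n
        # Gaussian mechanism applied to each gradient step; sigma here
        # is a placeholder, not a calibrated privacy parameter.
        theta = theta - eta * (grad + (sigma / n) * rng.standard_normal(d))
    return theta

# Toy usage: compare the private estimate to the true coefficients.
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.standard_normal(2000)
print(noisy_gradient_descent(X, y, np.zeros(3)))
```

The design point mirrors the abstract: the bounded psi-function of the robust loss controls sensitivity, which is what allows Gaussian noise of fixed scale to provide privacy without destroying the linear convergence of gradient descent.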
Cites Work
- Title not available
- Robust Estimation of a Location Parameter
- Title not available
- Title not available
- Robust Statistics
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Randomized response: a survey technique for eliminating evasive answer bias
- Smooth Optimization with Approximate Gradient
- First-order methods of smooth convex optimization with inexact oracle
- Differentially private empirical risk minimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Regularized \(M\)-estimators with nonconvexity: statistical and algorithmic theory for local optima
- Convex optimization: algorithms and complexity
- What can we learn privately?
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- The algorithmic foundations of differential privacy
- Lectures on convex optimization
- Minimax Optimal Procedures for Locally Private Estimation
- A statistical framework for differential privacy
- Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
- Sub-sampled Newton methods
- Composite convex optimization with global and local inexact oracles
- Generalized self-concordant functions: a recipe for Newton-type methods
- The cost of privacy: optimal rates of convergence for parameter estimation with differential privacy
- Finite sample differentially private confidence intervals
- Analyze Gauss: optimal bounds for privacy-preserving principal component analysis
- Privacy-preserving parametric inference: a case for robust statistics
- Gaussian Differential Privacy
- Differentially Private Significance Tests for Regression Coefficients
- Private stochastic convex optimization: optimal rates in linear time
- On certain new notion of order Cauchy sequences, continuity in (l)-group
Cited In (5)
- Gaussian differentially private robust mean estimation and inference
- Title not available
- Canonical noise distributions and private hypothesis tests
- Geometrizing rates of convergence under local differential privacy constraints
- Edge differentially private estimation in the \(\beta\)-model via jittering and method of moments