Differentially private inference via noisy optimization


DOI: 10.1214/23-AOS2321
arXiv: 2103.11003
OpenAlex: W3136559452
MaRDI QID: Q6183772


Authors: Marco Avella-Medina, Casey Bradshaw, Po-Ling Loh


Publication date: 4 January 2024

Published in: The Annals of Statistics

Abstract: We propose a general optimization-based framework for computing differentially private M-estimators and a new method for constructing differentially private confidence regions. First, we show that robust statistics can be used in conjunction with noisy gradient descent or noisy Newton methods to obtain optimal private estimators with global linear or quadratic convergence, respectively. We establish local and global convergence guarantees, under both local strong convexity and self-concordance, showing that our private estimators converge with high probability to a nearly optimal neighborhood of the non-private M-estimators. Second, we tackle the problem of parametric inference by constructing differentially private estimators of the asymptotic variance of our private M-estimators. This naturally leads to approximate pivotal statistics for constructing confidence regions and conducting hypothesis tests. We demonstrate the effectiveness of a bias correction that enhances small-sample empirical performance in simulations, and we illustrate the benefits of our methods in several numerical examples.
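To make the first step of the abstract concrete, below is a minimal sketch of noisy gradient descent for an M-estimator, assuming per-observation gradients bounded in norm by a constant B (as a robust, Huber-type psi-function would guarantee) with privacy from the standard Gaussian mechanism under basic composition. The function name noisy_gradient_descent and all tuning choices here (step size eta, iteration count T, the even per-step budget split) are illustrative assumptions, not the paper's exact algorithm or constants.

```python
import numpy as np

def noisy_gradient_descent(grad_fn, theta0, n, T, eta, B, eps, delta, rng=None):
    """Run T steps of noisy gradient descent for an M-estimation objective.

    grad_fn(theta) returns the n per-observation gradients (shape (n, d)),
    each assumed bounded in Euclidean norm by B, e.g. via a Huber-type
    psi-function. Gaussian noise is calibrated so that the full trajectory
    is (eps, delta)-differentially private under basic composition.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.array(theta0, dtype=float)
    eps_t, delta_t = eps / T, delta / T  # naive even split of the budget
    # L2 sensitivity of the averaged gradient is 2B/n when each term has norm <= B.
    sigma = (2.0 * B / n) * np.sqrt(2.0 * np.log(1.25 / delta_t)) / eps_t
    for _ in range(T):
        g = grad_fn(theta).mean(axis=0)                # average of bounded gradients
        g = g + rng.normal(scale=sigma, size=g.shape)  # Gaussian mechanism noise
        theta = theta - eta * g
    return theta

# Toy usage: a private Huber-type location estimate (illustrative only).
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, size=500)
B = 1.0
psi = lambda r: np.clip(r, -B, B)            # bounded score => bounded gradients
grads = lambda th: -psi(x - th[0])[:, None]  # shape (n, 1), each norm <= B
theta_hat = noisy_gradient_descent(grads, theta0=[0.0], n=len(x),
                                   T=50, eta=0.5, B=B, eps=1.0, delta=1e-5)
```

Splitting the budget evenly across iterations via basic composition is the simplest possible calibration; tighter composition arguments would permit less noise per step, and the paper's convergence guarantees are stated for the authors' own calibration, which this sketch does not reproduce.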


Full work available at URL: https://arxiv.org/abs/2103.11003






Cited in: 5 works




