On the privacy of noisy stochastic gradient descent for convex optimization
From MaRDI portal
Publication: Q6583673
Cites work
- Convex optimization: algorithms and complexity
- Differentially private empirical risk minimization
- High-dimensional Bayesian inference via the unadjusted Langevin algorithm
- Private selection from private candidates
- Private stochastic convex optimization: optimal rates in linear time
- Probability and computing. Randomization and probabilistic techniques in algorithms and data analysis
- Probability with Martingales
- Rényi Divergence and Kullback-Leibler Divergence
- Sampling from a log-concave distribution with projected Langevin Monte Carlo
- Theoretical Guarantees for Approximate Sampling from Smooth and Log-Concave Densities
- Theory of Cryptography
- What can we learn privately?