PCPs and the hardness of generating private synthetic data
From MaRDI portal
Publication:3000552
Recommendations
- PCPs and the hardness of generating synthetic data
- On the complexity of differentially private data release, efficient algorithms and hardness results
- A learning theory approach to noninteractive database privacy
- Faster private release of marginals on small databases
- Faster algorithms for privately releasing marginals
Cited in (17)
- What can we learn privately?
- Private measures, random walks, and synthetic data
- Segmentation, incentives, and privacy
- PCPs and the hardness of generating synthetic data
- Differential privacy on finite computers
- Separating computational and statistical differential privacy in the client-server model
- Strong hardness of privacy from weak traitor tracing
- Privacy preserving database generation for database application testing
- The complexity of differential privacy
- Fingerprinting codes and the price of approximate differential privacy
- Answering \(n^2+o(1)\) counting queries with differential privacy is hard
- Private Sampling: A Noiseless Approach for Generating Differentially Private Synthetic Data
- On the complexity of differentially private data release, efficient algorithms and hardness results
- Private data release via learning thresholds
- Covariance's loss is privacy's gain: computationally efficient, private and accurate synthetic data
- Order-revealing encryption and the hardness of private learning
- Efficient algorithms for privately releasing marginals via convex relaxations
This page was built for publication: PCPs and the hardness of generating private synthetic data