PCPs and the hardness of generating private synthetic data
Publication:3000552
DOI: 10.1007/978-3-642-19571-6_24
zbMATH Open: 1295.94190
OpenAlex: W1587575659
MaRDI QID: Q3000552
FDO: Q3000552
Authors: Jonathan Ullman, Salil Vadhan
Publication date: 19 May 2011
Published in: Theory of Cryptography
Full work available at URL: https://doi.org/10.1007/978-3-642-19571-6_24
Recommendations
- PCPs and the hardness of generating synthetic data
- On the complexity of differentially private data release, efficient algorithms and hardness results
- A learning theory approach to noninteractive database privacy
- Faster private release of marginals on small databases
- Faster algorithms for privately releasing marginals
Keywords: constraint satisfaction problems; digital signatures; inapproximability; privacy; probabilistically checkable proofs
Cited In (15)
- Privacy preserving database generation for database application testing
- Private Sampling: A Noiseless Approach for Generating Differentially Private Synthetic Data
- Efficient algorithms for privately releasing marginals via convex relaxations
- Title not available
- The Complexity of Differential Privacy
- Fingerprinting Codes and the Price of Approximate Differential Privacy
- Differential Privacy on Finite Computers
- Answering \(n^{2+o(1)}\) counting queries with differential privacy is hard
- Order-Revealing Encryption and the Hardness of Private Learning
- Strong Hardness of Privacy from Weak Traitor Tracing
- What can we learn privately?
- Separating Computational and Statistical Differential Privacy in the Client-Server Model
- Covariance's loss is privacy's gain: computationally efficient, private and accurate synthetic data
- Segmentation, Incentives, and Privacy
- Private measures, random walks, and synthetic data
This page was built for publication: PCPs and the hardness of generating private synthetic data