PCPs and the Hardness of Generating Private Synthetic Data
Publication: Q3000552
DOI: 10.1007/978-3-642-19571-6_24
zbMATH Open: 1295.94190
OpenAlex: W1587575659
MaRDI QID: Q3000552
FDO: Q3000552
Publication date: 19 May 2011
Published in: Theory of Cryptography
Full work available at URL: https://doi.org/10.1007/978-3-642-19571-6_24
Keywords: constraint satisfaction problems; digital signatures; inapproximability; privacy; probabilistically checkable proofs
Cited In (14)
- Private Sampling: A Noiseless Approach for Generating Differentially Private Synthetic Data
- Efficient algorithms for privately releasing marginals via convex relaxations
- Title not available
- The Complexity of Differential Privacy
- Fingerprinting Codes and the Price of Approximate Differential Privacy
- Differential Privacy on Finite Computers
- Answering \(n^{2+o(1)}\) counting queries with differential privacy is hard
- Order-Revealing Encryption and the Hardness of Private Learning
- Strong Hardness of Privacy from Weak Traitor Tracing
- Separating Computational and Statistical Differential Privacy in the Client-Server Model
- Covariance's loss is privacy's gain: computationally efficient, private and accurate synthetic data
- Segmentation, Incentives, and Privacy
- Private measures, random walks, and synthetic data
- What Can We Learn Privately?