Differentially private learning of geometric concepts
DOI: 10.1137/21M1406428
zbMATH Open: 1499.68310
arXiv: 1902.05017
OpenAlex: W2949290494
Wikidata: Q114074075 (Scholia: Q114074075)
MaRDI QID: Q5092508
FDO: Q5092508
Authors: Haim Kaplan, Yishay Mansour, Yossi Matias, Uri Stemmer
Publication date: 22 July 2022
Published in: SIAM Journal on Computing
Full work available at URL: https://arxiv.org/abs/1902.05017
MSC classifications:
- Learning and adaptive systems in artificial intelligence (68T05)
- Computer graphics; computational geometry (digital and algorithmic aspects) (68U05)
- Privacy of data (68P27)
Cites Work
- Learnability and the Vapnik-Chervonenkis dimension
- Differentially private combinatorial optimization
- Theory of Cryptography
- Our Data, Ourselves: Privacy Via Distributed Noise Generation
- A theory of the learnable
- Efficient noise-tolerant learning from statistical queries
- The VC dimension of \(k\)-fold union
- Title not available
- What can we learn privately?
- The algorithmic foundations of differential privacy
- Bounds on the sample complexity for private learning and private data release
- Order-revealing encryption and the hardness of private learning
- Differential privacy and robust statistics
- Characterizing the sample complexity of private learners
- Private Learning and Sanitization: Pure vs. Approximate Differential Privacy
- The complexity of differential privacy
- Simultaneous private learning of multiple concepts
- Sample complexity bounds on differentially private learning via communication complexity
- Private PAC learning implies finite Littlestone dimension
- Sample-efficient proper PAC learning with approximate differential privacy
- Learning Privately with Labeled and Unlabeled Examples
Cited In (4)