Statistical active learning algorithms for noise tolerance and differential privacy
Publication:2345952
DOI: 10.1007/S00453-014-9954-9
zbMATH Open: 1315.68202
arXiv: 1307.3102
OpenAlex: W2047836642
MaRDI QID: Q2345952
FDO: Q2345952
Authors: Maria-Florina Balcan, Vitaly Feldman
Publication date: 21 May 2015
Published in: Algorithmica
Abstract: We describe a framework for designing efficient active learning algorithms that are tolerant to random classification noise and are differentially private. The framework is based on active learning algorithms that are statistical in the sense that they rely on estimates of expectations of functions of filtered random examples. It builds on the powerful statistical query framework of Kearns (1993). We show that any efficient active statistical learning algorithm can be automatically converted to an efficient active learning algorithm that is tolerant to random classification noise as well as other forms of "uncorrelated" noise. The complexity of the resulting algorithms has information-theoretically optimal quadratic dependence on $1/(1-2\eta)$, where $\eta$ is the noise rate. We show that commonly studied concept classes, including thresholds, rectangles, and linear separators, can be efficiently actively learned in our framework. These results, combined with our generic conversion, lead to the first computationally efficient algorithms for actively learning some of these concept classes in the presence of random classification noise that provide an exponential improvement in the dependence on the error over their passive counterparts. In addition, we show that our algorithms can be automatically converted to efficient active differentially private algorithms. This leads to the first differentially private active learning algorithms with exponential label savings over the passive case.
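The noise-tolerance conversion described in the abstract rests on a standard statistical-query fact: under random classification noise at rate $\eta$, the label-independent part of a query is unaffected, while the label-dependent part is attenuated by exactly $1-2\eta$, so a clean expectation can be recovered by rescaling. The sketch below (an illustration of this classical correction, not the paper's full algorithm; the function names and the threshold example are our own) simulates this on a toy threshold concept:

```python
import random


def estimate_sq(phi, sample_noisy, eta, n=200000, seed=0):
    """Estimate E[phi(x, y)] over *clean* examples using only examples
    whose labels are flipped independently with probability eta
    (random classification noise).

    Decompose phi(x, y) = g(x) + y * h(x), where
      g(x) = (phi(x, +1) + phi(x, -1)) / 2   (label-independent part)
      h(x) = (phi(x, +1) - phi(x, -1)) / 2   (label-dependent part).
    Noise leaves E[g(x)] unchanged and shrinks E[y * h(x)] by (1 - 2*eta),
    so dividing the odd part's estimate by (1 - 2*eta) removes the bias.
    """
    rng = random.Random(seed)
    g_sum = 0.0
    odd_sum = 0.0
    for _ in range(n):
        x, y_noisy = sample_noisy(rng)
        g_sum += (phi(x, +1) + phi(x, -1)) / 2
        odd_sum += y_noisy * (phi(x, +1) - phi(x, -1)) / 2
    return g_sum / n + (odd_sum / n) / (1 - 2 * eta)


# Toy target: threshold at 0.5 on uniform [0, 1], noise rate eta = 0.3.
ETA = 0.3

def sample_noisy(rng):
    x = rng.random()
    y = 1 if x >= 0.5 else -1
    if rng.random() < ETA:  # flip the label with probability eta
        y = -y
    return x, y

# Query phi(x, y) = y * x; its clean expectation is 0.25 for this target.
est = estimate_sq(lambda x, y: y * x, sample_noisy, ETA)
```

Note that the variance of the corrected estimate grows as $1/(1-2\eta)^2$, which is exactly the quadratic dependence on $1/(1-2\eta)$ the abstract identifies as information-theoretically optimal.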
Full work available at URL: https://arxiv.org/abs/1307.3102
Keywords: differential privacy; active learning; computationally efficient algorithms; random classification noise; statistical query framework
Cites Work
- Learnability and the Vapnik-Chervonenkis dimension
- Theory of Cryptography
- A theory of the learnable
- Theory of Classification: a Survey of Some Recent Advances
- Selective sampling using the query by committee algorithm
- Agnostic active learning
- Analysis of perceptron-based active learning
- Efficient noise-tolerant learning from statistical queries
- Selective sampling and active learning from single and multiple teachers
- Minimax Bounds for Active Learning
- Margin Based Active Learning
- The geometry of logconcave functions and sampling algorithms
- Statistical algorithms and a lower bound for detecting planted cliques
- Theory of Disagreement-Based Active Learning
- Weakly learning DNF and characterizing statistical query learning using Fourier analysis
- Learning noisy linear classifiers via adaptive and selective sampling
- A polynomial-time algorithm for learning noisy linear threshold functions
- Specification and simulation of statistical query algorithms for efficiency and noise tolerance
- What can we learn privately?
- A complete characterization of statistical query learning with applications to evolvability
- DOI: 10.1162/153244302760200669
- A simple polynomial-time rescaling algorithm for solving linear programs
- Rademacher complexities and bounding the excess risk in active learning
- Teaching Dimension and the Complexity of Active Learning
- Efficient active learning of halfspaces: an aggressive approach
- The true sample complexity of active learning
- Smoothness, disagreement coefficient, and the label complexity of agnostic active learning
Cited In (5)