Learning noisy linear classifiers via adaptive and selective sampling
Publication: 413852
DOI: 10.1007/s10994-010-5191-x · zbMath: 1237.68139 · OpenAlex: W2132162087 · Wikidata: Q59538566 · Scholia: Q59538566 · MaRDI QID: Q413852
Claudio Gentile, Nicolò Cesa-Bianchi, Giovanni Cavallanti
Publication date: 8 May 2012
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-010-5191-x
Classification: Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Integrated Fisher linear discriminants: an empirical study
- Surrogate losses in passive and active learning
- Statistical active learning algorithms for noise tolerance and differential privacy
Cites Work
- Fast rates for support vector machines using Gaussian kernels
- Selective sampling using the query by committee algorithm
- Queries revisited
- Apple tasting
- Optimal aggregation of classifiers in statistical learning
- Statistical properties of kernel principal component analysis
- Rates of convergence in active learning
- Agnostic active learning
- Theory of Classification: a Survey of Some Recent Advances
- Active Learning in the Non-realizable Case
- On the Eigenspectrum of the Gram Matrix and the Generalization Error of Kernel-PCA
- Online Regularized Classification Algorithms
- Minimax Bounds for Active Learning
- A new technique of non-linear statistic prediction and its application in atmospheric systems
- Competitive On-line Statistics
- A Second-Order Perceptron Algorithm
- Margin Based Active Learning
- Prediction, Learning, and Games
- Learning Theory
- Convexity, Classification, and Risk Bounds
- Relative loss bounds for on-line density estimation with the exponential family of distributions