PAC-learning in the presence of one-sided classification noise
From MaRDI portal
Publication:2254605
DOI: 10.1007/s10472-012-9325-7 · zbMath: 1319.68122 · OpenAlex: W2067542027 · MaRDI QID: Q2254605
Publication date: 5 February 2015
Published in: Annals of Mathematics and Artificial Intelligence
Full work available at URL: https://doi.org/10.1007/s10472-012-9325-7
Cites Work
- Minimizing the error of linear separators on linearly inseparable data
- On multiple-instance learning of halfspaces
- Obtaining lower bounds using artificial components
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Approximating hyper-rectangles: Learning and pseudorandom sets
- Toward efficient agnostic learning
- A note on learning from multiple-instance examples
- Solving the multiple instance problem with axis-parallel rectangles
- A general lower bound on the number of examples needed for learning
- General bounds on the number of examples needed for learning probabilistic concepts
- Very simple classification rules perform well on most commonly used datasets
- Efficient noise-tolerant learning from statistical queries
- Learnability and the Vapnik-Chervonenkis dimension
- A theory of the learnable
- Lower bounds for algebraic decision trees
This page was built for publication: PAC-learning in the presence of one-sided classification noise