A general lower bound on the number of examples needed for learning
Publication: 1823011
DOI: 10.1016/0890-5401(89)90002-3
zbMath: 0679.68158
OpenAlex: W2045313701
MaRDI QID: Q1823011
Andrzej Ehrenfeucht, David Haussler, Michael Kearns, Leslie G. Valiant
Publication date: 1989
Published in: Information and Computation
Full work available at URL: https://doi.org/10.1016/0890-5401(89)90002-3
Analysis of algorithms and problem complexity (68Q25); Learning and adaptive systems in artificial intelligence (68T05)
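For orientation, a summary of the paper's main theorem in standard PAC notation (a restatement, not text from this record): any algorithm that learns a concept class of Vapnik-Chervonenkis dimension \(d\) to error \(\epsilon\) with failure probability at most \(\delta\) requires
\[
m = \Omega\!\left(\frac{1}{\epsilon}\log\frac{1}{\delta} + \frac{d}{\epsilon}\right)
\]
labelled examples. This matches the upper bound of ``Learnability and the Vapnik-Chervonenkis dimension'' (cited below) up to a \(\log(1/\epsilon)\) factor.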
Related Items
- Improving random forest algorithm by Lasso method
- Nonuniform learnability
- Deep learning: a statistical viewpoint
- On universal learning algorithms
- Learning Boolean concepts in the presence of many irrelevant features
- On the learnability of monotone \(k\mu\)-DNF formulae under product distributions
- On data classification by iterative linear partitioning
- Some connections between learning and optimization
- Learning DNF in time \(2^{\widetilde O(n^{1/3})}\)
- Discriminative learning can succeed where generative learning fails
- Classification by polynomial surfaces
- A new PAC bound for intersection-closed concept classes
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- Unnamed Item
- Quantifying inductive bias: AI learning algorithms and Valiant's learning framework
- Learning distributions by their density levels: A paradigm for learning without a teacher
- Localization of VC classes: beyond local Rademacher complexities
- On the complexity of learning from drifting distributions
- Supervised learning and co-training
- PAC learning of concept classes through the boundaries of their items
- Neural Nets with Superlinear VC-Dimension
- Learning reliably and with one-sided error
- An improved lower bound on query complexity for quantum PAC learning
- Learning figures with the Hausdorff metric by fractals -- towards computable binary classification
- Submodular goal value of Boolean functions
- Smart PAC-learners
- Learning under \(p\)-tampering poisoning attacks
- Complexity parameters for first order classes
- A Uniform Lower Error Bound for Half-Space Learning
- An inequality for uniform deviations of sample averages from their means
- Queries revisited.
- Using the doubling dimension to analyze the generalization of learning algorithms
- Learning privately with labeled and unlabeled examples
- Inductive logic programming
- The Vapnik-Chervonenkis dimension of decision trees with bounded rank
- \(P\)-sufficient statistics for PAC learning \(k\)-term-DNF formulas through enumeration
- Rigorous learning curve bounds from statistical mechanics
- On the necessity of Occam algorithms
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Bounds on the sample complexity for private learning and private data release
- Aspects of discrete mathematics and probability in the theory of machine learning
- PAC-learning in the presence of one-sided classification noise
- Improved bounds on quantum learning algorithms
- Gaining degrees of freedom in subsymbolic learning
- On the Capabilities of Higher-Order Neurons: A Radial Basis Function Approach
- Bandwidth choice for nonparametric classification
- (Machine) learning parameter regions
- Precise induction from statistical data
- Testing piecewise functions
- Computational sample complexity and attribute-efficient learning
- Robust logics
- Bounding Embeddings of VC Classes into Maximum Classes
- Shifting: one-inclusion mistake bounds and sample compression
- Theory of Classification: a Survey of Some Recent Advances
- Sharpening Occam's razor
- Results on learnability and the Vapnik-Chervonenkis dimension
- Prediction-preserving reducibility
- Prediction, learning, uniform convergence, and scale-sensitive dimensions
- Specification and simulation of statistical query algorithms for efficiency and noise tolerance
- Approximating hyper-rectangles: Learning and pseudorandom sets
- Smart PAC-Learners
- Noise-tolerant parallel learning of geometric concepts
- Bounds on the Sample Complexity for Private Learning and Private Data Release
- Convexity and logical analysis of data
- Explaining AI decisions using efficient methods for learning sparse Boolean formulae
- Optimality of SVM: novel proofs and tighter bounds
- Exact VC-dimension of Boolean monomials
- Unnamed Item
- Improved lower bounds for learning from noisy examples: An information-theoretic approach
- A general framework for supervised learning. Probably almost Bayesian algorithms
- A computational study on the performance of artificial neural networks under changing structural design and data distribution
- The complexity of theory revision
Cites Work
- Unnamed Item
- Results on learnability and the Vapnik-Chervonenkis dimension
- A new polynomial-time algorithm for linear programming
- Learnability with respect to fixed distributions
- \(\epsilon\)-nets and simplex range queries
- Quantifying inductive bias: AI learning algorithms and Valiant's learning framework
- Occam's razor
- Fast probabilistic algorithms for Hamiltonian circuits and matchings
- Some special Vapnik-Chervonenkis classes
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Predicting \(\{ 0,1\}\)-functions on randomly drawn points
- Queries and concept learning
- Learnability and the Vapnik-Chervonenkis dimension
- A theory of the learnable
- Computational limitations on learning from examples
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations