A general lower bound on the number of examples needed for learning

From MaRDI portal

Publication: 1823011

DOI: 10.1016/0890-5401(89)90002-3
zbMath: 0679.68158
OpenAlex: W2045313701
MaRDI QID: Q1823011

Andrzej Ehrenfeucht, Michael Kearns, David Haussler, Leslie G. Valiant

Publication date: 1989

Published in: Information and Computation

Full work available at URL: https://doi.org/10.1016/0890-5401(89)90002-3




Related Items (72)

Improving random forest algorithm by Lasso method
Nonuniform learnability
Deep learning: a statistical viewpoint
On universal learning algorithms
Learning Boolean concepts in the presence of many irrelevant features
On the learnability of monotone \(k\mu\)-DNF formulae under product distributions
On data classification by iterative linear partitioning
Some connections between learning and optimization
Learning DNF in time \(2^{\widetilde O(n^{1/3})}\)
Discriminative learning can succeed where generative learning fails
Classification by polynomial surfaces
A new PAC bound for intersection-closed concept classes
Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
Unnamed Item
Quantifying inductive bias: AI learning algorithms and Valiant's learning framework
Learning distributions by their density levels: A paradigm for learning without a teacher
Localization of VC classes: beyond local Rademacher complexities
On the complexity of learning from drifting distributions
Supervised learning and co-training
PAC learning of concept classes through the boundaries of their items
Neural Nets with Superlinear VC-Dimension
Learning reliably and with one-sided error
An improved lower bound on query complexity for quantum PAC learning
Learning figures with the Hausdorff metric by fractals -- towards computable binary classification
Submodular goal value of Boolean functions
Smart PAC-learners
Learning under \(p\)-tampering poisoning attacks
Complexity parameters for first order classes
A Uniform Lower Error Bound for Half-Space Learning
An inequality for uniform deviations of sample averages from their means
Queries revisited.
Using the doubling dimension to analyze the generalization of learning algorithms
Learning privately with labeled and unlabeled examples
Inductive logic programming
The Vapnik-Chervonenkis dimension of decision trees with bounded rank
\(P\)-sufficient statistics for PAC learning \(k\)-term-DNF formulas through enumeration
Rigorous learning curve bounds from statistical mechanics
On the necessity of Occam algorithms
Decision theoretic generalizations of the PAC model for neural net and other learning applications
Bounds on the sample complexity for private learning and private data release
Aspects of discrete mathematics and probability in the theory of machine learning
PAC-learning in the presence of one-sided classification noise
Improved bounds on quantum learning algorithms
Gaining degrees of freedom in subsymbolic learning
On the Capabilities of Higher-Order Neurons: A Radial Basis Function Approach
Bandwidth choice for nonparametric classification
(Machine) learning parameter regions
Precise induction from statistical data
Testing piecewise functions
Computational sample complexity and attribute-efficient learning
Robust logics
Bounding Embeddings of VC Classes into Maximum Classes
Shifting: one-inclusion mistake bounds and sample compression
Theory of Classification: a Survey of Some Recent Advances
Sharpening Occam's razor
Results on learnability and the Vapnik-Chervonenkis dimension
Prediction-preserving reducibility
Prediction, learning, uniform convergence, and scale-sensitive dimensions
Specification and simulation of statistical query algorithms for efficiency and noise tolerance
Approximating hyper-rectangles: Learning and pseudorandom sets
Smart PAC-Learners
Noise-tolerant parallel learning of geometric concepts
Bounds on the Sample Complexity for Private Learning and Private Data Release
Convexity and logical analysis of data
Explaining AI decisions using efficient methods for learning sparse Boolean formulae
Optimality of SVM: novel proofs and tighter bounds
Exact VC-dimension of Boolean monomials
Unnamed Item
Improved lower bounds for learning from noisy examples: An information-theoretic approach
A general framework for supervised learning. Probably almost Bayesian algorithms
A computational study on the performance of artificial neural networks under changing structural design and data distribution
The complexity of theory revision




Cites Work




This page was built for publication: A general lower bound on the number of examples needed for learning