Learnability and the Vapnik-Chervonenkis dimension


Publication:3474905

DOI: 10.1145/76359.76371
zbMath: 0697.68079
OpenAlex: W2154952480
Wikidata: Q56214749
Scholia: Q56214749
MaRDI QID: Q3474905

Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, Manfred K. Warmuth

Publication date: 1989

Published in: Journal of the ACM

Full work available at URL: https://doi.org/10.1145/76359.76371




Related Items (showing the first 100 items)

On statistical learning of simplices: unmixing problem revisited
Dynamic sizing of multilayer perceptrons
Efficient distribution-free learning of probabilistic concepts
Nonuniform learnability
A review of combinatorial problems arising in feedforward neural network design
On universal learning algorithms
Halfspace learning, linear programming, and nonmalicious distributions
Learning Boolean concepts in the presence of many irrelevant features
The minimum feature set problem
Vapnik-Chervonenkis dimension and (pseudo-)hyperplane arrangements
Learning unions of high-dimensional boxes over the reals
First-order \(jk\)-clausal theories are PAC-learnable
Toward efficient agnostic learning
The learnability of description logics with equality constraints
Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
The VC dimension of \(k\)-fold union
An algorithmic theory of learning: robust concepts and random projection
Quantifying inductive bias: AI learning algorithms and Valiant's learning framework
On ordinal VC-dimension and some notions of complexity
From learning in the limit to stochastic finite learning
Learning intersections and thresholds of halfspaces
Exploiting label dependencies for improved sample complexity
VC dimensions of principal component analysis
Selection of relevant features and examples in machine learning
The Perceptron algorithm versus Winnow: linear versus logarithmic mistake bounds when few input variables are relevant
On the generalization error of fixed combinations of classifiers
On the relative sizes of learnable sets
Dynamical recognizers: real-time language recognition by analog computers
Supervised learning and co-training
PAC learning, VC dimension, and the arithmetic hierarchy
On the hardness of learning intersections of two halfspaces
Meta-interpretive learning of higher-order dyadic datalog: predicate invention revisited
Robust cutpoints in the logical analysis of numerical data
VC bounds on the cardinality of nearly orthogonal function classes
Generalization error bounds for the logical analysis of data
Improved approximation for guarding simple galleries from the perimeter
On learning a union of half spaces
Using the doubling dimension to analyze the generalization of learning algorithms
The value of agreement a new boosting algorithm
The Vapnik-Chervonenkis dimension of decision trees with bounded rank
Equivalence of models for polynomial learnability
Almost tight bounds for \(\epsilon\)-nets
Learning in parallel
Complexity of computing Vapnik-Chervonenkis dimension and some generalized dimensions
\(\varepsilon\)-approximations of \(k\)-label spaces
Classic learning
Neural networks with quadratic VC dimension
On the necessity of Occam algorithms
Decision theoretic generalizations of the PAC model for neural net and other learning applications
Three \(\Sigma^p_2\)-complete problems in computational learning theory
A machine discovery from amino acid sequences by decision trees over regular patterns
Reducing mechanism design to algorithm design via machine learning
Structure identification in relational data
Aggregate operators in constraint query languages
Bounding sample size with the Vapnik-Chervonenkis dimension
Playing monotone games to understand learning behaviors
Analysis of a multi-category classifier
Learning in the limit with lattice-structured hypothesis spaces
Maximal pattern complexity, dual system and pattern recognition
\(k\)-Fold unions of low-dimensional concept classes
Partial observability and learnability
Fuzzy lattice reasoning (FLR) classifier and its application for ambient ozone estimation
Multi-category classifiers and sample width
Sharpening Occam's razor
A notion of task relatedness yielding provable multiple-task learning guarantees
An upper bound on the sample complexity of PAC-learning halfspaces with respect to the uniform distribution
Learning commutative deterministic finite state automata in polynomial time
Teachability in computational learning
Combinatorial variability of Vapnik-Chervonenkis classes with applications to sample compression schemes
Scale-sensitive dimensions and skeleton estimates for classification
Vapnik-Chervonenkis dimension of recurrent neural networks
The degree of approximation of sets in Euclidean space using sets with bounded Vapnik-Chervonenkis dimension
Results on learnability and the Vapnik-Chervonenkis dimension
General bounds on statistical query learning and PAC learning with noise via hypothesis boosting
Optimal mistake bound learning is hard
Prediction-preserving reducibility
Prediction, learning, uniform convergence, and scale-sensitive dimensions
Specification and simulation of statistical query algorithms for efficiency and noise tolerance
Learning with unreliable boundary queries
Learning with restricted focus of attention
An introduction to some statistical aspects of PAC learning theory
Sample size lower bounds in PAC learning by Algorithmic Complexity Theory
Using a similarity measure for credible classification
Approximating hyper-rectangles: Learning and pseudorandom sets
New bounds on classical and quantum one-way communication complexity
Noise-tolerant parallel learning of geometric concepts
Maximal width learning of binary functions
Hitting sets when the VC-dimension is small
Deciding the Vapnik-Červonenkis dimension is \(\Sigma_3^p\)-complete
On the learnability of rich function classes
Combinatorics and connectionism
A result of Vapnik with applications
Discrepancy and approximations for bounded VC-dimension
Learnability with respect to fixed distributions
Similarity, kernels, and the fundamental constraints on cognition
Version spaces and the consistency problem
Bounding the vertex cover number of a hypergraph
Sparse approximate multiquadric interpolation
On learning embedded midbit functions
Maximizing agreements and coagnostic learning






This page was built for publication: Learnability and the Vapnik-Chervonenkis dimension