Learnability and the Vapnik-Chervonenkis dimension

From MaRDI portal

Publication:3474905

DOI: 10.1145/76359.76371 · zbMath: 0697.68079 · OpenAlex: W2154952480 · Wikidata: Q56214749 · Scholia: Q56214749 · MaRDI QID: Q3474905

Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, Manfred K. Warmuth

Publication date: 1989

Published in: Journal of the ACM

Full work available at URL: https://doi.org/10.1145/76359.76371
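The paper this record describes introduced the connection between PAC learnability and the Vapnik-Chervonenkis dimension. As a hedged illustration of the central notion (not the paper's own algorithm), the sketch below brute-forces a lower bound on the VC dimension of a finite concept class by checking which point sets it shatters; the helper names `shatters` and `vc_dimension_lower_bound` are hypothetical, chosen for this example.

```python
from itertools import combinations

def shatters(points, concept_class):
    """True iff every binary labeling of `points` is realized
    by some concept in `concept_class`."""
    realized = {tuple(c(x) for x in points) for c in concept_class}
    return len(realized) == 2 ** len(points)

def vc_dimension_lower_bound(domain, concept_class, max_d):
    """Largest d <= max_d for which some d-subset of `domain` is
    shattered; a lower bound on the true VC dimension."""
    best = 0
    for d in range(1, max_d + 1):
        if any(shatters(s, concept_class) for s in combinations(domain, d)):
            best = d
    return best

# One-sided thresholds on the reals: h_t(x) = 1 iff x >= t.
thresholds = [lambda x, t=t: int(x >= t) for t in (-0.5, 0.5, 1.5, 2.5)]
print(vc_dimension_lower_bound([0, 1, 2], thresholds, 3))  # -> 1
```

Thresholds shatter any single point but never two, since they can only produce monotone labelings such as (0, 1), never (1, 0); hence the VC dimension of this class is 1.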

Related Items (first 100 shown)

A linear lower bound on the unbounded error probabilistic communication complexity
Learning from rounded-off data
Advanced elementary formal systems
Intrinsic complexity of learning geometrical concepts from positive data
A Sauer-Shelah-Perles lemma for sumsets
On data classification by iterative linear partitioning
Some connections between learning and optimization
Efficient algorithms for learning functions with bounded variation
Learning DNF in time \(2^{\widetilde O(n^{1/3})}\)
Concentration inequalities for non-causal random fields
RotEqNet: rotation-equivariant network for fluid systems with symmetric high-order tensors
VC-saturated set systems
Can PAC learning algorithms tolerate random attribute noise?
On the Vapnik-Chervonenkis dimension of computer programs which use transcendental elementary operations
New bounds on the minimal dispersion
Minimal dispersion of large volume boxes in the cube
Testing nonlinear operators
A counterexample concerning uniform ergodic theorems for a class of functions
Classification by polynomial surfaces
On specifying Boolean functions by labelled examples
On PAC learning algorithms for rich Boolean function classes
Large width nearest prototype classification on general distance spaces
Fusion methods for multiple sensor systems with unknown error densities
A generalization of Sauer's lemma
A new PAC bound for intersection-closed concept classes
VC-dimensions of finite automata and commutative finite automata with \(k\) letters and \(n\) states
Learning an intersection of a constant number of halfspaces over a uniform distribution
Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
Inductive inference in the limit of empirically adequate theories
On fixed-parameter tractability and approximability of NP optimization problems
Almost optimal set covers in finite VC-dimension
The VC-dimension of set systems defined by graphs
A sufficient condition for polynomial distribution-dependent learnability
Improved upper bounds for probabilities of uniform deviations
On the VC-dimension of depth four threshold circuits and the complexity of Boolean-valued functions
An approach to guided learning of Boolean functions
Approximation and learning of convex superpositions
Learning distributions by their density levels: A paradigm for learning without a teacher
Learning recursive functions from approximations
The \(\varepsilon\)-\(t\)-net problem
On the complexity of learning from drifting distributions
The bounded injury priority method and the learnability of unions of rectangles
Classification based on prototypes with spheres of influence
A theory of formal synthesis via inductive learning
On the value of partial information for learning from examples
Efficient learning with virtual threshold gates
An efficient membership-query algorithm for learning DNF with respect to the uniform distribution
Approximate testing and its relationship to learning
A framework for incremental learning of logic programs
PAC-learning from general examples
PAC learning of concept classes through the boundaries of their items
Decomposing arrangements of hyperplanes: VC-dimension, combinatorial dimension, and point location
Permissive planning: Extending classical planning to uncertain task domains
On the difficulty of approximately maximizing agreements
An improved lower bound on query complexity for quantum PAC learning
Learning figures with the Hausdorff metric by fractals -- towards computable binary classification
Learning under \(p\)-tampering poisoning attacks
Bounds on the minimax rate for estimating a prior over a VC class from independent learning tasks
On the dispersion of sparse grids
A Sauer-Shelah-Perles lemma for lattices
The true sample complexity of active learning
Accuracy of techniques for the logical analysis of data
The monotone theory for the PAC-model
Queries revisited
On learning multicategory classification with sample queries
On weak \(\epsilon\)-nets and the Radon number
Learning privately with labeled and unlabeled examples
The minimal \(k\)-dispersion of point sets in high dimensions
\(P\)-sufficient statistics for PAC learning \(k\)-term-DNF formulas through enumeration
Large-width bounds for learning half-spaces on distance spaces
Optimization approaches to supervised classification
Meta-interpretive learning: application to grammatical inference
Bounds on the sample complexity for private learning and private data release
PAC-learning in the presence of one-sided classification noise
(Machine) learning parameter regions
Testing piecewise functions
The complexity of exact learning of acyclic conditional preference networks from swap examples
The power of random counterexamples
The learnability of unions of two rectangles in the two-dimensional discretized space
Practical issues in temporal difference learning
New applications of random sampling in computational geometry
Learning decision trees from random examples
A general lower bound on the number of examples needed for learning
An average-case optimal one-variable pattern language learner
Proper learning of \(k\)-term DNF formulas from satisfying assignments
Convexity and logical analysis of data
Mean estimation and regression under heavy-tailed distributions: A survey
The Vapnik-Chervonenkis dimension of a random graph
Exact VC-dimension of Boolean monomials
An \(O(\lg \lg {\mathrm {OPT}})\)-approximation algorithm for multi-guarding galleries
On computing the diameter of a point set in high dimensional Euclidean space
PAC learning with nasty noise
On the complexity of learning for spiking neurons with temporal coding
Improved lower bounds for learning from noisy examples: An information-theoretic approach
Learning fixed-dimension linear thresholds from fragmented data
Quantum learning of concentrated Boolean functions
Statistical active learning algorithms for noise tolerance and differential privacy
Transfer theorems via sign conditions
A new abstract combinatorial dimension for exact learning via queries
The complexity of theory revision

This page was built for publication: Learnability and the Vapnik-Chervonenkis dimension