Learnability and the Vapnik-Chervonenkis dimension

Publication: 3474905

DOI: 10.1145/76359.76371
zbMath: 0697.68079
OpenAlex: W2154952480
Wikidata: Q56214749
Scholia: Q56214749
MaRDI QID: Q3474905

Anselm Blumer, Andrzej Ehrenfeucht, David Haussler, Manfred K. Warmuth

Publication date: 1989

Published in: Journal of the ACM

Full work available at URL: https://doi.org/10.1145/76359.76371



Related Items

On statistical learning of simplices: unmixing problem revisited, Dynamic sizing of multilayer perceptrons, Efficient distribution-free learning of probabilistic concepts, Nonuniform learnability, A review of combinatorial problems arising in feedforward neural network design, On universal learning algorithms, Halfspace learning, linear programming, and nonmalicious distributions, Learning Boolean concepts in the presence of many irrelevant features, The minimum feature set problem, Vapnik-Chervonenkis dimension and (pseudo-)hyperplane arrangements, Learning unions of high-dimensional boxes over the reals, First-order \(jk\)-clausal theories are PAC-learnable, Toward efficient agnostic learning, The learnability of description logics with equality constraints, Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension, The VC dimension of \(k\)-fold union, An algorithmic theory of learning: robust concepts and random projection, Quantifying inductive bias: AI learning algorithms and Valiant's learning framework, On ordinal VC-dimension and some notions of complexity, From learning in the limit to stochastic finite learning, Learning intersections and thresholds of halfspaces, Exploiting label dependencies for improved sample complexity, VC dimensions of principal component analysis, Selection of relevant features and examples in machine learning, The Perceptron algorithm versus Winnow: linear versus logarithmic mistake bounds when few input variables are relevant, On the generalization error of fixed combinations of classifiers, On the relative sizes of learnable sets, Dynamical recognizers: real-time language recognition by analog computers, Supervised learning and co-training, PAC learning, VC dimension, and the arithmetic hierarchy, On the hardness of learning intersections of two halfspaces, Meta-interpretive learning of higher-order dyadic datalog: predicate invention revisited, Robust cutpoints in the logical analysis of numerical data, VC bounds on the cardinality of nearly orthogonal function classes, Generalization error bounds for the logical analysis of data, Improved approximation for guarding simple galleries from the perimeter, On learning a union of half spaces, Using the doubling dimension to analyze the generalization of learning algorithms, The value of agreement, a new boosting algorithm, The Vapnik-Chervonenkis dimension of decision trees with bounded rank, Equivalence of models for polynomial learnability, Almost tight bounds for \(\epsilon\)-nets, Learning in parallel, Complexity of computing Vapnik-Chervonenkis dimension and some generalized dimensions, \(\varepsilon\)-approximations of \(k\)-label spaces, Classic learning, Neural networks with quadratic VC dimension, On the necessity of Occam algorithms, Decision theoretic generalizations of the PAC model for neural net and other learning applications, Three \(\Sigma_2^p\)-complete problems in computational learning theory, A machine discovery from amino acid sequences by decision trees over regular patterns, Reducing mechanism design to algorithm design via machine learning, Structure identification in relational data, Aggregate operators in constraint query languages, Bounding sample size with the Vapnik-Chervonenkis dimension, Playing monotone games to understand learning behaviors, Analysis of a multi-category classifier, Learning in the limit with lattice-structured hypothesis spaces, Maximal pattern complexity, dual system and pattern recognition, \(k\)-Fold unions 
of low-dimensional concept classes, Partial observability and learnability, Fuzzy lattice reasoning (FLR) classifier and its application for ambient ozone estimation, Multi-category classifiers and sample width, Sharpening Occam's razor, A notion of task relatedness yielding provable multiple-task learning guarantees, An upper bound on the sample complexity of PAC-learning halfspaces with respect to the uniform distribution, Learning commutative deterministic finite state automata in polynomial time, Teachability in computational learning, Combinatorial variability of Vapnik-Chervonenkis classes with applications to sample compression schemes, Scale-sensitive dimensions and skeleton estimates for classification, Vapnik-Chervonenkis dimension of recurrent neural networks, The degree of approximation of sets in euclidean space using sets with bounded Vapnik-Chervonenkis dimension, Results on learnability and the Vapnik-Chervonenkis dimension, General bounds on statistical query learning and PAC learning with noise via hypothesis boosting, Optimal mistake bound learning is hard, Prediction-preserving reducibility, Prediction, learning, uniform convergence, and scale-sensitive dimensions, Specification and simulation of statistical query algorithms for efficiency and noise tolerance, Learning with unreliable boundary queries, Learning with restricted focus of attention, An introduction to some statistical aspects of PAC learning theory, Sample size lower bounds in PAC learning by Algorithmic Complexity Theory, Using a similarity measure for credible classification, Approximating hyper-rectangles: Learning and pseudorandom sets, New bounds on classical and quantum one-way communication complexity, Noise-tolerant parallel learning of geometric concepts, Maximal width learning of binary functions, Hitting sets when the VC-dimension is small, Deciding the Vapnik-Červonenkis dimension is \(\Sigma_3^p\)-complete, On the learnability of rich function classes, Combinatorics and connectionism, A result of Vapnik with applications, Discrepancy and approximations for bounded VC-dimension, Learnability with respect to fixed distributions, Similarity, kernels, and the fundamental constraints on cognition, Version spaces and the consistency problem, Bounding the vertex cover number of a hypergraph, Sparse approximate multiquadric interpolation, On learning embedded midbit functions, Maximizing agreements and coagnostic learning, A linear lower bound on the unbounded error probabilistic communication complexity., Learning from rounded-off data., Advanced elementary formal systems., Intrinsic complexity of learning geometrical concepts from positive data, A Sauer-Shelah-Perles lemma for sumsets, On data classification by iterative linear partitioning, Some connections between learning and optimization, Efficient algorithms for learning functions with bounded variation, Learning DNF in time \(2^{\widetilde O(n^{1/3})}\), Concentration inequalities for non-causal random fields, RotEqNet: rotation-equivariant network for fluid systems with symmetric high-order tensors, VC-saturated set systems, Can PAC learning algorithms tolerate random attribute noise?, On the Vapnik-Chervonenkis dimension of computer programs which use transcendental elementary operations, New bounds on the minimal dispersion, Minimal dispersion of large volume boxes in the cube, Testing nonlinear operators, A counterexample concerning uniform ergodic theorems for a class of functions, Classification by polynomial surfaces, On specifying 
Boolean functions by labelled examples, On PAC learning algorithms for rich Boolean function classes, Large width nearest prototype classification on general distance spaces, Fusion methods for multiple sensor systems with unknown error densities, A generalization of Sauer's lemma, A new PAC bound for intersection-closed concept classes, VC-dimensions of finite automata and commutative finite automata with \(k\) letters and \(n\) states, Learning an intersection of a constant number of halfspaces over a uniform distribution, Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers, Inductive inference in the limit of empirically adequate theories, On fixed-parameter tractability and approximability of NP optimization problems, Almost optimal set covers in finite VC-dimension, The VC-dimension of set systems defined by graphs, A sufficient condition for polynomial distribution-dependent learnability, Improved upper bounds for probabilities of uniform deviations, On the VC-dimension of depth four threshold circuits and the complexity of Boolean-valued functions, An approach to guided learning of Boolean functions, Approximation and learning of convex superpositions, Learning distributions by their density levels: A paradigm for learning without a teacher, Learning recursive functions from approximations, The \(\varepsilon\)-\(t\)-net problem, On the complexity of learning from drifting distributions, The bounded injury priority method and the learnability of unions of rectangles, Classification based on prototypes with spheres of influence, A theory of formal synthesis via inductive learning, On the value of partial information for learning from examples, Efficient learning with virtual threshold gates, An efficient membership-query algorithm for learning DNF with respect to the uniform distribution, Approximate testing and its relationship to learning, A framework for incremental learning of logic programs, PAC-learning from general examples, PAC learning of concept classes through the boundaries of their items, Decomposing arrangements of hyperplanes: VC-dimension, combinatorial dimension, and point location, Permissive planning: Extending classical planning to uncertain task domains., On the difficulty of approximately maximizing agreements., An improved lower bound on query complexity for quantum PAC learning, Learning figures with the Hausdorff metric by fractals -- towards computable binary classification, Learning under \(p\)-tampering poisoning attacks, Bounds on the minimax rate for estimating a prior over a VC class from independent learning tasks, On the dispersion of sparse grids, A Sauer-Shelah-Perles lemma for lattices, The true sample complexity of active learning, Accuracy of techniques for the logical analysis of data, The monotone theory for the PAC-model., Queries revisited., On learning multicategory classification with sample queries., On weak \(\epsilon\)-nets and the Radon number, Learning privately with labeled and unlabeled examples, The minimal \(k\)-dispersion of point sets in high dimensions, \(P\)-sufficient statistics for PAC learning \(k\)-term-DNF formulas through enumeration, Large-width bounds for learning half-spaces on distance spaces, Optimization approaches to supervised classification, Meta-interpretive learning: application to grammatical inference, Bounds on the sample complexity for private learning and private data release, PAC-learning in the presence of one-sided classification noise, (Machine) learning parameter 
regions, Testing piecewise functions, The complexity of exact learning of acyclic conditional preference networks from swap examples, The power of random counterexamples, The learnability of unions of two rectangles in the two-dimensional discretized space, Practical issues in temporal difference learning, New applications of random sampling in computational geometry, Learning decision trees from random examples, A general lower bound on the number of examples needed for learning, An average-case optimal one-variable pattern language learner, Proper learning of \(k\)-term DNF formulas from satisfying assignments, Convexity and logical analysis of data, Mean estimation and regression under heavy-tailed distributions: A survey, The Vapnik-Chervonenkis dimension of a random graph, Exact VC-dimension of Boolean monomials, An \(O(\lg \lg {\mathrm {OPT}})\)-approximation algorithm for multi-guarding galleries, On computing the diameter of a point set in high dimensional Euclidean space., PAC learning with nasty noise., On the complexity of learning for spiking neurons with temporal coding., Improved lower bounds for learning from noisy examples: An information-theoretic approach, Learning fixed-dimension linear thresholds from fragmented data, Quantum learning of concentrated Boolean functions, Statistical active learning algorithms for noise tolerance and differential privacy, Transfer theorems via sign conditions, A new abstract combinatorial dimension for exact learning via queries, The complexity of theory revision, Learning with Limited Samples: Meta-Learning and Applications to Communication Systems, Deep learning: a statistical viewpoint, DNA sequencing and string learning, What Circuit Classes Can Be Learned with Non-Trivial Savings?, Quasi-Polynomial Time Approximation Scheme for Weighted Geometric Set Cover on Pseudodisks and Halfspaces, Bloom Filters in Adversarial Environments, Revisiting Shinohara's algorithm for computing descriptive patterns, On the geometry of polytopes generated by heavy-tailed random vectors, Using the Perceptron Algorithm to Find Consistent Hypotheses, Ten More Years of Error Rate Research, A framework for polynomial-time query learnability, The VC-dimension of K-vertex D-polytopes, General Feasibility Bounds for Sample Average Approximation via Vapnik--Chervonenkis Dimension, Fast sequential and parallel algorithms for finding the largest rectangle separating two sets, The VC dimension of metric balls under Fréchet and Hausdorff distances, Differentially Private Learning of Geometric Concepts, On the VC-Dimension of Binary Codes, Incremental learning of logic programs, From undecidability of non-triviality and finiteness to undecidability of learnability, On the non-efficient PAC learnability of conjunctive queries, An optimal algorithm for proper learning of unions of two rectangles with queries, On multivariate randomized classification trees: \(l_0\)-based sparsity, VC dimension and decomposition methods, Beyond NP: Quantifying over Answer Sets, VC-dimensions for graphs (extended abstract), The learnability of quantum states, Learning reliably and with one-sided error, Learning programs with magic values, Learning half-spaces on general infinite spaces equipped with a distance function, Compression schemes for concept classes induced by three types of discrete undirected graphical models, Efficiency in the Identification in the Limit Learning Paradigm, The minimal spherical dispersion, Rates of convergence in active learning, 
Neural Networks with Local Receptive Fields and Superlinear VC Dimension, A theoretical framework for deep transfer learning, Smart PAC-learners, Teaching and Compressing for Low VC-Dimension, Complexity parameters for first order classes, An Upper Bound of the Minimal Dispersion via Delta Covers, Learning bounds via sample width for classifiers on finite metric spaces, A hybrid classifier based on boxes and nearest neighbors, Core-Sets: Updated Survey, Learning intersections of halfspaces with a margin, Turning Big Data Into Tiny Data: Constant-Size Coresets for \(k\)-Means, PCA, and Projective Clustering, Application of a Generalization of Russo's Formula to Learning from Multiple Random Oracles, A data and digital-contracts driven method for pricing complex derivatives, Some combinatorics of imperfect information, Aspects of discrete mathematics and probability in the theory of machine learning, Explanatory and creative alternatives to the MDL principle, A lower bound for families of Natarajan dimension \(d\), Optimal schedules for monitoring anytime algorithms, Gaining degrees of freedom in subsymbolic learning, Learning logic programs with structured background knowledge, Ambiguous chance constrained problems and robust optimization, Agnostic learning of geometric patterns, Empirical minimization, Polynomial certificates for propositional classes, Training a Single Sigmoidal Neuron Is Hard, On the Complexity of Computing and Learning with Multiplicative Neural Networks, An algorithmic theory of learning: Robust concepts and random projection, Precise induction from statistical data, Learning automata algorithms for pattern classification., Determination and the No-Free-Lunch Paradox, Pac-learning non-recursive Prolog clauses, Tractability of the approximation of high-dimensional rank one tensors, The optimal PAC bound for intersection-closed concept classes, Order compression schemes, On the Nonlearnability of a Single Spiking Neuron, Bounding Embeddings of VC Classes into Maximum Classes, Shifting: one-inclusion mistake bounds and sample compression, Theory of Classification: a Survey of Some Recent Advances, Greedy Strategy Works for k-Center Clustering with Outliers and Coreset Construction, Discussion on: ``Why is resorting to fate wise? A critical look at randomized algorithms in systems and control'', Valid Generalisation from Approximate Interpolation, Labeled Compression Schemes for Extremal Classes, Smart PAC-Learners, Bounds on the Sample Complexity for Private Learning and Private Data Release, Neural networks and some applications to finance, Exact Learning of Discretized Geometric Concepts, On the influence of the variable ordering for algorithmic learning using OBDDs, A remark on the minimal dispersion, O-PCF algorithm for one-class classification