Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers

DOI: 10.1007/BF00993408
zbMath: 0831.68087
OpenAlex: W4249207878
MaRDI QID: Q1900973

Paul W. Goldberg, Mark R. Jerrum

Publication date: 29 October 1995

Published in: Machine Learning

Full work available at URL: https://doi.org/10.1007/bf00993408



Related Items

Learning from rounded-off data
On the Vapnik-Chervonenkis dimension of computer programs which use transcendental elementary operations
Learning distributions by their density levels: A paradigm for learning without a teacher
The VC dimension of metric balls under Fréchet and Hausdorff distances
On the generalization error of fixed combinations of classifiers
Dynamical recognizers: real-time language recognition by analog computers
The Vapnik-Chervonenkis dimension of graph and recursive neural networks
Coresets for \((k, \ell)\)-median clustering under the Fréchet distance
Learning bounds for quantum circuits in the agnostic setting
Vapnik-Chervonenkis Dimension of Parallel Arithmetic Computations
Lower bounds on performance of metric tree indexing schemes for exact similarity search in high dimensions
Indexability, concentration, and VC theory
Approximation in shift-invariant spaces with deep ReLU neural networks
On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
A size-depth trade-off for the analog computation of Boolean functions
A tight upper bound on the generalization error of feedforward neural networks
Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
Neural networks with quadratic VC dimension
On the computation of Boolean functions by analog circuits of bounded fan-in
Aggregate operators in constraint query languages
Error bounds for approximations with deep ReLU networks
Theory of Classification: a Survey of Some Recent Advances
Combinatorial variability of Vapnik-Chervonenkis classes with applications to sample compression schemes
Vapnik-Chervonenkis dimension of recurrent neural networks
The complexity of model classes, and smoothing noisy data
A learning result for continuous-time recurrent neural networks
Partitioning points by parallel planes
Marginal singularity and the benefits of labels in covariate-shift
Uniformly supported approximate equilibria in families of games
On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
On the complexity of learning for spiking neurons with temporal coding


