Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
MaRDI QID: Q1900973
DOI: 10.1007/BF00993408
zbMath: 0831.68087
OpenAlex: W4249207878
Paul W. Goldberg, Mark R. Jerrum
Publication date: 29 October 1995
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/bf00993408
Related Items
- Learning from rounded-off data
- On the Vapnik-Chervonenkis dimension of computer programs which use transcendental elementary operations
- Learning distributions by their density levels: A paradigm for learning without a teacher
- The VC dimension of metric balls under Fréchet and Hausdorff distances
- On the generalization error of fixed combinations of classifiers
- Dynamical recognizers: real-time language recognition by analog computers
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
- Coresets for \((k, \ell)\)-median clustering under the Fréchet distance
- Learning bounds for quantum circuits in the agnostic setting
- Vapnik-Chervonenkis Dimension of Parallel Arithmetic Computations
- Lower bounds on performance of metric tree indexing schemes for exact similarity search in high dimensions
- Indexability, concentration, and VC theory
- Approximation in shift-invariant spaces with deep ReLU neural networks
- On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
- A size-depth trade-off for the analog computation of Boolean functions
- A tight upper bound on the generalization error of feedforward neural networks
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Neural networks with quadratic VC dimension
- On the computation of Boolean functions by analog circuits of bounded fan-in
- Aggregate operators in constraint query languages
- Error bounds for approximations with deep ReLU networks
- Theory of Classification: a Survey of Some Recent Advances
- Combinatorial variability of Vapnik-Chervonenkis classes with applications to sample compression schemes
- Vapnik-Chervonenkis dimension of recurrent neural networks
- The complexity of model classes, and smoothing noisy data
- A learning result for continuous-time recurrent neural networks
- Partitioning points by parallel planes
- Marginal singularity and the benefits of labels in covariate-shift
- Uniformly supported approximate equilibria in families of games
- On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
- On the complexity of learning for spiking neurons with temporal coding
Cites Work
- A linear time algorithm for the Hausdorff distance between convex polygons
- Results on learnability and the Vapnik-Chervonenkis dimension
- Some new Vapnik-Chervonenkis classes
- Occam's razor
- Some special Vapnik-Chervonenkis classes
- On the computational complexity and geometry of the first-order theory of the reals. I: Introduction. Preliminaries. The geometry of semi-algebraic sets. The decision problem for the existential theory of the reals
- Feedforward nets for interpolation and classification
- Central limit theorems for empirical measures
- Localization vs. identification of semi-algebraic sets
- Predicting \(\{ 0,1\}\)-functions on randomly drawn points
- A general lower bound on the number of examples needed for learning
- Approximate matching of polygonal shapes
- Probably Approximate Learning of Sets and Functions
- Learnability and the Vapnik-Chervonenkis dimension
- A theory of the learnable
- Lower bounds for algebraic decision trees
- Vapnik-Chervonenkis Classes of Definable Sets
- Finiteness results for sigmoidal “neural” networks
- Lower Bounds for Approximation by Nonlinear Manifolds
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- On the Betti Numbers of Real Varieties