Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
DOI: 10.1007/BF00993408
zbMATH Open: 0831.68087
OpenAlex: W4249207878
MaRDI QID: Q1900973
FDO: Q1900973
Authors: Paul W. Goldberg, Mark Jerrum
Publication date: 29 October 1995
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/bf00993408
Recommendations
- VC Dimension Bounds for Analytic Algebraic Computations
- Results on learnability and the Vapnik-Chervonenkis dimension
- Vapnik-Chervonenkis Dimension of Parallel Arithmetic Computations
- Learnability and the Vapnik-Chervonenkis dimension
- Complexity of computing Vapnik-Chervonenkis dimension and some generalized dimensions
Cites Work
- Some special Vapnik-Chervonenkis classes
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Learnability and the Vapnik-Chervonenkis dimension
- On the Betti Numbers of Real Varieties
- Central limit theorems for empirical measures
- A theory of the learnable
- Lower Bounds for Approximation by Nonlinear Manifolds
- Title not available
- Feedforward nets for interpolation and classification
- A general lower bound on the number of examples needed for learning
- Results on learnability and the Vapnik-Chervonenkis dimension
- Occam's razor
- Probably Approximate Learning of Sets and Functions
- Vapnik-Chervonenkis Classes of Definable Sets
- Title not available
- On the computational complexity and geometry of the first-order theory of the reals. I: Introduction. Preliminaries. The geometry of semi-algebraic sets. The decision problem for the existential theory of the reals
- Approximate matching of polygonal shapes
- A linear time algorithm for the Hausdorff distance between convex polygons
- Predicting \(\{0,1\}\)-functions on randomly drawn points
- Lower bounds for algebraic decision trees
- Finiteness results for sigmoidal “neural” networks
- Some new Vapnik-Chervonenkis classes
- Localization vs. identification of semi-algebraic sets
Cited In (49)
- A tight upper bound on the generalization error of feedforward neural networks
- Title not available
- On the generalization error of fixed combinations of classifiers
- Error bounds for approximations with deep ReLU networks
- Learning from rounded-off data
- Title not available
- The VC dimension of metric balls under Fréchet and Hausdorff distances
- Vapnik-Chervonenkis dimension of recurrent neural networks
- On ordinal VC-dimension and some notions of complexity
- On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
- Aggregate operators in constraint query languages
- Coresets for \((k, \ell ) \)-median clustering under the Fréchet distance
- A learning result for continuous-time recurrent neural networks
- The complexity of model classes, and smoothing noisy data
- VC Dimension and Uniform Learnability of Sparse Polynomials and Rational Functions
- On ordinal VC-dimension and some notions of complexity
- Some new maximum VC classes
- Results on learnability and the Vapnik-Chervonenkis dimension
- Trial and error: A new approach to space-bounded learning
- Title not available
- Indexability, concentration, and VC theory
- On the complexity of learning for spiking neurons with temporal coding
- VC dimensions of principal component analysis
- Bounding sample size with the Vapnik-Chervonenkis dimension
- Dynamical recognizers: real-time language recognition by analog computers
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- PAC learning, VC dimension, and the arithmetic hierarchy
- On the Vapnik-Chervonenkis dimension of computer programs which use transcendental elementary operations
- Vapnik-Chervonenkis Dimension of Parallel Arithmetic Computations
- Learning distributions by their density levels: A paradigm for learning without a teacher
- Combinatorial variability of Vapnik-Chervonenkis classes with applications to sample compression schemes
- Approximation of classifiers by deep perceptron networks
- On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
- Upper Bound for the Number of Concepts of Contranominal-Scale Free Contexts
- Learnability and the Vapnik-Chervonenkis dimension
- VC Dimension Bounds for Analytic Algebraic Computations
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
- Learning bounds for quantum circuits in the agnostic setting
- Marginal singularity and the benefits of labels in covariate-shift
- Uniformly supported approximate equilibria in families of games
- Learning Theory
- Lower bounds on performance of metric tree indexing schemes for exact similarity search in high dimensions
- Title not available
- Partitioning points by parallel planes
- A size-depth trade-off for the analog computation of Boolean functions
- Approximation in shift-invariant spaces with deep ReLU neural networks
- Theory of Classification: a Survey of Some Recent Advances
- Neural networks with quadratic VC dimension
- On the computation of Boolean functions by analog circuits of bounded fan-in