Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
From MaRDI portal
Recommendations
- VC Dimension Bounds for Analytic Algebraic Computations
- Results on learnability and the Vapnik-Chervonenkis dimension
- Vapnik-Chervonenkis Dimension of Parallel Arithmetic Computations
- Learnability and the Vapnik-Chervonenkis dimension
- Complexity of computing Vapnik-Chervonenkis dimension and some generalized dimensions
Cites work
- scientific article; zbMATH DE number 53984 (no title available)
- scientific article; zbMATH DE number 3311772 (no title available)
- A general lower bound on the number of examples needed for learning
- A linear time algorithm for the Hausdorff distance between convex polygons
- A theory of the learnable
- Approximate matching of polygonal shapes
- Central limit theorems for empirical measures
- Feedforward nets for interpolation and classification
- Finiteness results for sigmoidal “neural” networks
- Learnability and the Vapnik-Chervonenkis dimension
- Localization vs. identification of semi-algebraic sets
- Lower Bounds for Approximation by Nonlinear Manifolds
- Lower bounds for algebraic decision trees
- Occam's razor
- On the Betti Numbers of Real Varieties
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- On the computational complexity and geometry of the first-order theory of the reals. I: Introduction. Preliminaries. The geometry of semi-algebraic sets. The decision problem for the existential theory of the reals
- Predicting \(\{ 0,1\}\)-functions on randomly drawn points
- Probably Approximate Learning of Sets and Functions
- Results on learnability and the Vapnik-Chervonenkis dimension
- Some new Vapnik-Chervonenkis classes
- Some special Vapnik-Chervonenkis classes
- Vapnik-Chervonenkis Classes of Definable Sets
Cited in (49)
- The VC dimension of metric balls under Fréchet and Hausdorff distances
- Combinatorial variability of Vapnik-Chervonenkis classes with applications to sample compression schemes
- On the Vapnik-Chervonenkis dimension of computer programs which use transcendental elementary operations
- On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
- Vapnik-Chervonenkis dimension of recurrent neural networks
- Learning Theory
- Aggregate operators in constraint query languages
- Bounding sample size with the Vapnik-Chervonenkis dimension
- Learning from rounded-off data
- On ordinal VC-dimension and some notions of complexity
- Partitioning points by parallel planes
- On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
- Learnability and the Vapnik-Chervonenkis dimension
- Vapnik-Chervonenkis Dimension of Parallel Arithmetic Computations
- Approximation of classifiers by deep perceptron networks
- Learning bounds for quantum circuits in the agnostic setting
- Lower bounds on performance of metric tree indexing schemes for exact similarity search in high dimensions
- Learning distributions by their density levels: A paradigm for learning without a teacher
- A tight upper bound on the generalization error of feedforward neural networks
- Dynamical recognizers: real-time language recognition by analog computers
- Neural networks with quadratic VC dimension
- On the computation of Boolean functions by analog circuits of bounded fan-in
- scientific article; zbMATH DE number 1966611 (no title available)
- Approximation in shift-invariant spaces with deep ReLU neural networks
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Results on learnability and the Vapnik-Chervonenkis dimension
- On the complexity of learning for spiking neurons with temporal coding
- VC dimensions of principal component analysis
- A size-depth trade-off for the analog computation of Boolean functions
- On ordinal VC-dimension and some notions of complexity
- scientific article; zbMATH DE number 67631 (no title available)
- Coresets for \((k, \ell ) \)-median clustering under the Fréchet distance
- Trial and error: A new approach to space-bounded learning
- Marginal singularity and the benefits of labels in covariate-shift
- Uniformly supported approximate equilibria in families of games
- VC Dimension Bounds for Analytic Algebraic Computations
- scientific article; zbMATH DE number 7559228 (no title available)
- On the generalization error of fixed combinations of classifiers
- Upper Bound for the Number of Concepts of Contranominal-Scale Free Contexts
- Error bounds for approximations with deep ReLU networks
- Indexability, concentration, and VC theory
- Some new maximum VC classes
- Theory of Classification: a Survey of Some Recent Advances
- A learning result for continuous-time recurrent neural networks
- The complexity of model classes, and smoothing noisy data
- scientific article; zbMATH DE number 7064043 (no title available)
- PAC learning, VC dimension, and the arithmetic hierarchy
- VC Dimension and Uniform Learnability of Sparse Polynomials and Rational Functions
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
MaRDI item: Q1900973