Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers

From MaRDI portal

Publication:1900973

DOI: 10.1007/BF00993408 · zbMath: 0831.68087 · OpenAlex: W4249207878 · MaRDI QID: Q1900973

Paul W. Goldberg, Mark R. Jerrum

Publication date: 29 October 1995

Published in: Machine Learning

Full work available at URL: https://doi.org/10.1007/bf00993408







Related Items (34)

* Learning from rounded-off data.
* On the Vapnik-Chervonenkis dimension of computer programs which use transcendental elementary operations
* Learning distributions by their density levels: A paradigm for learning without a teacher
* The VC dimension of metric balls under Fréchet and Hausdorff distances
* On the generalization error of fixed combinations of classifiers
* Dynamical recognizers: real-time language recognition by analog computers
* The Vapnik-Chervonenkis dimension of graph and recursive neural networks
* Coresets for \((k, \ell)\)-median clustering under the Fréchet distance
* Learning bounds for quantum circuits in the agnostic setting
* Vapnik-Chervonenkis Dimension of Parallel Arithmetic Computations
* Lower bounds on performance of metric tree indexing schemes for exact similarity search in high dimensions
* Indexability, concentration, and VC theory
* Approximation in shift-invariant spaces with deep ReLU neural networks
* On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
* A size-depth trade-off for the analog computation of Boolean functions
* A tight upper bound on the generalization error of feedforward neural networks
* Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
* Neural networks with quadratic VC dimension
* On the computation of Boolean functions by analog circuits of bounded fan-in
* Aggregate operators in constraint query languages
* Unnamed Item
* Approximation of classifiers by deep perceptron networks
* Error bounds for approximations with deep ReLU networks
* Theory of Classification: a Survey of Some Recent Advances
* Combinatorial variability of Vapnik-Chervonenkis classes with applications to sample compression schemes
* Vapnik-Chervonenkis dimension of recurrent neural networks
* Unnamed Item
* The complexity of model classes, and smoothing noisy data
* A learning result for continuous-time recurrent neural networks
* Partitioning points by parallel planes
* Marginal singularity and the benefits of labels in covariate-shift
* Uniformly supported approximate equilibria in families of games
* On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
* On the complexity of learning for spiking neurons with temporal coding.





This page was built for publication: Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers