Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
From MaRDI portal
Publication: Q676431
DOI: 10.1006/JCSS.1997.1477
zbMATH Open: 0869.68088
OpenAlex: W2006698588
Wikidata: Q56214754 (Scholia: Q56214754)
MaRDI QID: Q676431
FDO: Q676431
Authors: Marek Karpinski, Angus Macintyre
Publication date: 18 March 1997
Published in: Journal of Computer and System Sciences
Full work available at URL: https://ora.ox.ac.uk/objects/uuid:a14465ce-11d9-4f89-aeec-fcf0bea603ed
Recommendations
- scientific article; zbMATH DE number 1263195
- scientific article; zbMATH DE number 7064043
- Dimension-independent bounds on the degree of approximation by neural networks
- scientific article; zbMATH DE number 1304255
- Neural networks with quadratic VC dimension
- Voronovskaja type theorems and high-order convergence neural network operators with sigmoidal functions
Cites Work
- Title not available
- The elementary theory of restricted analytic fields with exponentiation
- Model completeness results for expansions of the ordered field of real numbers by restricted Pfaffian functions and the exponential function
- On the Betti Numbers of Real Varieties
- The measure of the critical values of differentiable maps
- Lower Bounds for Approximation by Nonlinear Manifolds
- Feedforward nets for interpolation and classification
- Definable Sets in Ordered Structures. II
- Title not available
- Title not available
- Vapnik-Chervonenkis Classes of Definable Sets
- Title not available
- An exact sequence in differential topology
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- A result of Vapnik with applications
- VC Dimension and Uniform Learnability of Sparse Polynomials and Rational Functions
- Title not available
- Neural Nets with Superlinear VC-Dimension
- Title not available
- Finiteness results for sigmoidal “neural” networks
- Bounds for the computational power and learning complexity of analog neural nets
- On the computation of Boolean functions by analog circuits of bounded fan-in
- On the decidability of sparse univariate polynomial interpolation
Cited In (36)
- A tight upper bound on the generalization error of feedforward neural networks
- Multiscale topology optimization using neural network surrogate models
- Pfaffian sets and O-minimality
- Transfer theorems via sign conditions
- On the Capabilities of Higher-Order Neurons: A Radial Basis Function Approach
- Title not available
- Negative results for approximation using single layer and multilayer feedforward neural networks
- The VC dimension of metric balls under Fréchet and Hausdorff distances
- Randomized algorithms for robust controller synthesis using statistical learning theory: a tutorial overview
- Vapnik-Chervonenkis dimension of recurrent neural networks
- On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
- On multivariate randomized classification trees: \(l_0\)-based sparsity, VC dimension and decomposition methods
- Descartes' Rule of Signs for Radial Basis Function Neural Networks
- Convergence of a least-squares Monte Carlo algorithm for American option pricing with dependent sample data
- Randomized algorithms for the synthesis of cautious adaptive controllers
- Title not available
- Vapnik-Chervonenkis density in some theories without the independence property. I
- A statistical learning theory approach for uncertain linear and bilinear matrix inequalities
- On the Vapnik-Chervonenkis dimension of computer programs which use transcendental elementary operations
- Neural Networks with Local Receptive Fields and Superlinear VC Dimension
- Probabilistic solutions to some NP-hard matrix problems
- Theory of graph neural networks: representation and learning
- Model theory and agnostic online learning via excellent sets
- Vapnik-Chervonenkis Dimension of Parallel Arithmetic Computations
- Research on probabilistic methods for control system design
- Randomized algorithms for robust controller synthesis using statistical learning theory
- On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
- Geometric decision procedures and the VC dimension of linear arithmetic theories
- Model Theory: Geometrical and Set-Theoretic Aspects and Prospects
- Deep learning: a statistical viewpoint
- On the complexity of computing and learning with multiplicative neural networks
- The Vapnik-Chervonenkis dimension of graph and recursive neural networks
- Title not available
- Partitioning points by parallel planes
- Aspects of discrete mathematics and probability in the theory of machine learning
- Theory of Classification: a Survey of Some Recent Advances