On the Complexity of Computing and Learning with Multiplicative Neural Networks
Publication: 2780854
DOI: 10.1162/08997660252741121
zbMath: 0993.68083
DBLP: journals/neco/Schmitt02
OpenAlex: W2109292177
Wikidata: Q52047964 (Scholia: Q52047964)
MaRDI QID: Q2780854
Publication date: 14 March 2002
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/08997660252741121
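For context on the publication's subject: a multiplicative neural network replaces or augments the usual summing units with units that multiply their inputs, such as product units (\(y = \prod_i x_i^{w_i}\)) and sigma-pi units (weighted sums of input monomials). The sketch below illustrates these two standard unit types as commonly defined in the literature; it is not code from the paper, and the function names are hypothetical.

```python
import numpy as np

def product_unit(x, w):
    """Product unit: prod_i x_i ** w_i.

    For positive inputs this equals exp(sum_i w_i * log x_i),
    which is how product units are often analyzed.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    return float(np.prod(x ** w))

def sigma_pi_unit(x, monomials, coeffs):
    """Sigma-pi (higher-order) unit: weighted sum of input monomials.

    `monomials` is a list of index tuples; each tuple selects the
    inputs whose product forms one higher-order term.
    """
    x = np.asarray(x, dtype=float)
    return float(sum(c * np.prod(x[list(m)])
                     for c, m in zip(coeffs, monomials)))

# Example: a sigma-pi unit computing 2*x0*x1 - 3*x2
print(sigma_pi_unit([1.0, 2.0, 4.0], [(0, 1), (2,)], [2.0, -3.0]))  # -8.0
print(product_unit([2.0, 3.0], [1.0, 2.0]))  # 2 * 3**2 = 18.0
```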
Related Items (12)
- Sign-representation of Boolean functions using a small number of monomials
- Combined weight and density bounds on the polynomial threshold function representation of Boolean functions
- Logistic regression using covariates obtained by product-unit neural network models
- Image compression and reconstruction using \(\pi_t\)-sigma neural networks
- Income prediction in the agrarian sector using product unit neural networks
- Many regression algorithms, one unified model: a review
- Multilogistic regression by means of evolutionary product-unit neural networks
- dNSP: a biologically inspired dynamic neural network approach to signal processing
- Genetic algorithm-based feature set partitioning for classification problems
- On the Capabilities of Higher-Order Neurons: A Radial Basis Function Approach
- Evolutionary product unit based neural networks for regression
- An Upper Bound on the Minimum Number of Monomials Required to Separate Dichotomies of \(\{-1, 1\}^n\)
Cites Work
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Neural networks with quadratic VC dimension
- Programmed interactions in higher-order neural networks: Maximal capacity
- Programmed interactions in higher-order neural networks: The outer-product algorithm
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Feedforward nets for interpolation and classification
- Vapnik-Chervonenkis dimension of recurrent neural networks
- Localization vs. identification of semi-algebraic sets
- Characterizations of learnability for classes of \(\{0,\dots,n\}\)-valued functions
- Classification by polynomial surfaces
- On the sample complexity for nonoverlapping neural networks
- Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks
- Computing with the Leaky Integrate-and-Fire Neuron: Logarithmic Computation and Multiplication
- Sample sizes for multiple-output threshold networks
- Über die Nachrichtenverarbeitung in der Nervenzelle [On information processing in the nerve cell]
- Learnability and the Vapnik-Chervonenkis dimension
- A theory of the learnable
- A simple model for neural computation with firing rates and firing correlations
- Learning of higher-order perceptrons with tunable complexities
- VC Dimension and Uniform Learnability of Sparse Polynomials and Rational Functions
- Neural Nets with Superlinear VC-Dimension
- Information Processing in Dendritic Trees
- Constructing deterministic finite-state automata in recurrent neural networks
- Enumeration of Seven-Argument Threshold Functions
- Lower Bounds for Approximation by Nonlinear Manifolds
- A logical calculus of the ideas immanent in nervous activity