Bounds for the Computational Power and Learning Complexity of Analog Neural Nets
DOI: 10.1137/S0097539793256041 · zbMATH Open: 0870.68061 · OpenAlex: W2066915075 · MaRDI QID: Q4337641 · FDO: Q4337641
Publication date: 26 May 1997
Published in: SIAM Journal on Computing
Full work available at URL: https://doi.org/10.1137/s0097539793256041
Classification (MSC):
- Learning and adaptive systems in artificial intelligence (68T05)
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Complexity classes (hierarchies, relations among complexity classes, etc.) (68Q15)
- Analytic circuit theory (94C05)
Cited In (10)
- Three citing works without available titles
- Piecewise-Linear Neural Networks and Their Relationship to Rule Extraction from Data
- Rational approximation techniques for analysis of neural networks
- General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results
- Stronger connections between circuit analysis and circuit lower bounds, via PCPs of proximity
- Discrete mathematics of neural networks. Selected topics
- Computing with discrete multi-valued neurons
- Analog computation via neural networks