Bounds for the computational power and learning complexity of analog neural nets
Publication: Q5248502
DOI: 10.1145/167088.167193
zbMath: 1310.68182
OpenAlex: W2031417520
MaRDI QID: Q5248502
Publication date: 7 May 2015
Published in: Proceedings of the twenty-fifth annual ACM symposium on Theory of computing - STOC '93
Full work available at URL: https://doi.org/10.1145/167088.167193
Mathematics Subject Classification:
Learning and adaptive systems in artificial intelligence (68T05)
Computational difficulty of problems (lower bounds, completeness, difficulty of approximation, etc.) (68Q17)
Related Items (9):
Computing over the reals with addition and order
On the VC-dimension of depth four threshold circuits and the complexity of Boolean-valued functions
Machines Over the Reals and Non-Uniformity
Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
A weak version of the Blum, Shub, and Smale model
Neural networks with quadratic VC dimension
On the computation of Boolean functions by analog circuits of bounded fan-in
On digital nondeterminism
On the computational structure of the connected components of a hard problem