Computational power of neural networks: a characterization in terms of Kolmogorov complexity
Publication:4345611
DOI: 10.1109/18.605580 · zbMath: 0878.68103 · OpenAlex: W2099977515 · Wikidata: Q56058051 · Scholia: Q56058051 · MaRDI QID: Q4345611
Ricard Gavaldà, José L. Balcázar, Hava T. Siegelmann
Publication date: 23 July 1997
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.605580
Learning and adaptive systems in artificial intelligence (68T05)
Neural networks for/in biological studies, artificial life and related topics (92B20)
Algorithmic information theory (Kolmogorov complexity, etc.) (68Q30)
Related Items
General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results
Subrecursive neural networks
Three analog neurons are Turing universal
The expressive power of analog recurrent neural networks on infinite input streams
Quasi-periodic \(\beta\)-expansions and cut languages
Computational capabilities of analog and evolving neural networks over infinite input streams
Oracles and Advice as Measurements
Automata complete computation with Hodgkin-Huxley neural networks composed of synfire rings
Analog neuron hierarchy
Kobayashi compressibility
Cut Languages in Rational Bases
A Hierarchy for \(BPP//\log\!\star\) Based on Counting Calls to an Oracle
Continuous-Time Symmetric Hopfield Nets Are Computationally Universal
The structure of logarithmic advice complexity classes
Stochastic analog networks and computational complexity