Analog computation via neural networks

From MaRDI portal

Publication:1331940

DOI: 10.1016/0304-3975(94)90178-3
zbMath: 0822.68029
OpenAlex: W2050778826
Wikidata: Q55897750
Scholia: Q55897750
MaRDI QID: Q1331940

Hava T. Siegelmann

Publication date: 18 October 1995

Published in: Theoretical Computer Science

Full work available at URL: https://doi.org/10.1016/0304-3975(94)90178-3




Related Items (70)

A refutation of Penrose's Gödelian case against artificial intelligence
Computing over the reals with addition and order
Edge of chaos and prediction of computational performance for neural circuit models
On the computational power of dynamical systems and hybrid systems
The simple dynamics of super Turing theories
A family of universal recurrent networks
Global exponential convergence and global convergence in finite time of non-autonomous discontinuous neural networks
Recurrent Neural Networks with Small Weights Implement Definite Memory Machines
The Computational Power of Interactive Recurrent Neural Networks
General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results
Computation with perturbed dynamical systems
Unnamed Item
AN ANALOGUE-DIGITAL CHURCH-TURING THESIS
The promise of analog computation
A note on discreteness and virtuality in analog computing
Simulation of Turing machine with uEAC-computable functions
Automata theory based on quantum logic: Some characterizations
Dynamical recognizers: real-time language recognition by analog computers
Expressive Power of Non-deterministic Evolving Recurrent Neural Networks in Terms of Their Attractor Dynamics
The ARNN model relativises \(\mathrm{P}=\mathrm{NP}\) and \(\mathrm{P}\neq \mathrm{NP}\)
Subrecursive neural networks
On Goles' universal machines: a computational point of view
Three analog neurons are Turing universal
Design of continuous-time recurrent neural networks with piecewise-linear activation function for generation of prescribed sequences of bipolar vectors
The Kolmogorov-Arnold representation theorem revisited
Computations with oracles that measure vanishing quantities
The Power of Machines That Control Experiments
A provably stable neural network Turing machine with finite precision and time
The expressive power of analog recurrent neural networks on infinite input streams
Quasi-periodic \(\beta\)-expansions and cut languages
Computational capabilities of analog and evolving neural networks over infinite input streams
Finite time convergent learning law for continuous neural networks
On the computational power of discrete Hopfield nets
The modal argument for hypercomputing minds
Hypercomputation: Philosophical issues
Analog computation through high-dimensional physical chaotic neuro-dynamics
On the computational power of probabilistic and faulty neural networks
Automata complete computation with Hodgkin-Huxley neural networks composed of synfire rings
Analog neuron hierarchy
A weak version of the Blum, Shub, and Smale model
On the computation of Boolean functions by analog circuits of bounded fan-in
Iteration, inequalities, and differentiability in analog computers
A survey of computational complexity results in systems and control
Interval-valued computations and their connection with PSPACE
Cut Languages in Rational Bases
Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations
The stability of saturated linear dynamical systems is undecidable
Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
A new Gödelian argument for hypercomputing minds based on the busy beaver problem
The case for hypercomputation
How much can analog and hybrid systems be proved (super-)Turing
Analog computation beyond the Turing limit
The many forms of hypercomputation
An optical model of computation
Expressive power of first-order recurrent neural networks determined by their attractor dynamics
Continuous-Time Symmetric Hopfield Nets Are Computationally Universal
Vapnik-Chervonenkis dimension of recurrent neural networks
Computing with truly asynchronous threshold logic networks
On digital nondeterminism
Closed-form analytic maps in one and two dimensions can simulate universal Turing machines
The structure of logarithmic advice complexity classes
Quantum automata and quantum grammars
Frontier between decidability and undecidability: A survey
THE MYTH OF 'THE MYTH OF HYPERCOMPUTATION'
Stochastic analog networks and computational complexity
Logic and Complexity in Cognitive Science
Analog computation with dynamical systems
A theory of complexity for continuous time systems
Continuous-Time Symmetric Hopfield Nets Are Computationally Universal
A Survey on Analog Models of Computation







This page was built for publication: Analog computation via neural networks