Analog computation via neural networks
From MaRDI portal
Publication: 1331940
DOI: 10.1016/0304-3975(94)90178-3
zbMath: 0822.68029
OpenAlex: W2050778826
Wikidata: Q55897750 (Scholia: Q55897750)
MaRDI QID: Q1331940
Publication date: 18 October 1995
Published in: Theoretical Computer Science
Full work available at URL: https://doi.org/10.1016/0304-3975(94)90178-3
Related Items (70)
A refutation of Penrose's Gödelian case against artificial intelligence ⋮ Computing over the reals with addition and order ⋮ Edge of chaos and prediction of computational performance for neural circuit models ⋮ On the computational power of dynamical systems and hybrid systems ⋮ The simple dynamics of super Turing theories ⋮ A family of universal recurrent networks ⋮ Global exponential convergence and global convergence in finite time of non-autonomous discontinuous neural networks ⋮ Recurrent Neural Networks with Small Weights Implement Definite Memory Machines ⋮ The Computational Power of Interactive Recurrent Neural Networks ⋮ General-Purpose Computation with Neural Networks: A Survey of Complexity Theoretic Results ⋮ Computation with perturbed dynamical systems ⋮ AN ANALOGUE-DIGITAL CHURCH-TURING THESIS ⋮ The promise of analog computation ⋮ A note on discreteness and virtuality in analog computing ⋮ Simulation of Turing machine with uEAC-computable functions ⋮ Automata theory based on quantum logic: Some characterizations ⋮ Dynamical recognizers: real-time language recognition by analog computers ⋮ Expressive Power of Non-deterministic Evolving Recurrent Neural Networks in Terms of Their Attractor Dynamics ⋮ The ARNN model relativises \(\mathrm{P}=\mathrm{NP}\) and \(\mathrm{P}\neq \mathrm{NP}\) ⋮ Subrecursive neural networks ⋮ On Goles' universal machines: a computational point of view ⋮ Three analog neurons are Turing universal ⋮ Design of continuous-time recurrent neural networks with piecewise-linear activation function for generation of prescribed sequences of bipolar vectors ⋮ The Kolmogorov-Arnold representation theorem revisited ⋮ Computations with oracles that measure vanishing quantities ⋮ The Power of Machines That Control Experiments ⋮ A provably stable neural network Turing machine with finite precision and time ⋮ The expressive power of analog recurrent neural networks on infinite input streams ⋮ Quasi-periodic \(\beta\)-expansions and cut languages ⋮ Computational capabilities of analog and evolving neural networks over infinite input streams ⋮ Finite time convergent learning law for continuous neural networks ⋮ On the computational power of discrete Hopfield nets ⋮ The modal argument for hypercomputing minds ⋮ Hypercomputation: Philosophical issues ⋮ Analog computation through high-dimensional physical chaotic neuro-dynamics ⋮ On the computational power of probabilistic and faulty neural networks ⋮ Automata complete computation with Hodgkin-Huxley neural networks composed of synfire rings ⋮ Analog neuron hierarchy ⋮ A weak version of the Blum, Shub, and Smale model ⋮ On the computation of Boolean functions by analog circuits of bounded fan-in ⋮ Iteration, inequalities, and differentiability in analog computers ⋮ A survey of computational complexity results in systems and control ⋮ Interval-valued computations and their connection with PSPACE ⋮ Cut Languages in Rational Bases ⋮ Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations ⋮ The stability of saturated linear dynamical systems is undecidable ⋮ Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations ⋮ A new Gödelian argument for hypercomputing minds based on the busy beaver problem ⋮ The case for hypercomputation ⋮ How much can analog and hybrid systems be proved (super-)Turing ⋮ Analog computation beyond the Turing limit ⋮ The many forms of hypercomputation ⋮ An optical model of computation ⋮ Expressive power of first-order recurrent neural networks determined by their attractor dynamics ⋮ Continuous-Time Symmetric Hopfield Nets Are Computationally Universal ⋮ Vapnik-Chervonenkis dimension of recurrent neural networks ⋮ Computing with truly asynchronous threshold logic networks ⋮ On digital nondeterminism ⋮ Closed-form analytic maps in one and two dimensions can simulate universal Turing machines ⋮ The structure of logarithmic advice complexity classes ⋮ Quantum automata and quantum grammars ⋮ Frontier between decidability and undecidability: A survey ⋮ THE MYTH OF 'THE MYTH OF HYPERCOMPUTATION' ⋮ Stochastic analog networks and computational complexity ⋮ Logic and Complexity in Cognitive Science ⋮ Analog computation with dynamical systems ⋮ A theory of complexity for continuous time systems ⋮ A Survey on Analog Models of Computation
Cites Work
- Turing machines that take advice
- The complexity of analog computation
- The promise of analog computation
- Constant Depth Reducibility
- On connectionist models
- On a theory of computation and complexity over the real numbers: NP-completeness, recursive functions and universal machines
This page was built for publication: Analog computation via neural networks