On the global output convergence of a class of recurrent neural networks with time-varying inputs
From MaRDI portal
Publication: 557639
DOI: 10.1016/j.neunet.2004.10.005
zbMath: 1071.68537
Wikidata: Q51518798
Scholia: Q51518798
MaRDI QID: Q557639
Publication date: 30 June 2005
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2004.10.005
Keywords: Optimization; Lipschitz continuity; Global output convergence; Lyapunov diagonal stability; Recurrent neural networks; Time-varying input
MSC classification: 68T05 (Learning and adaptive systems in artificial intelligence)
Related Items
- Global output convergence of Cohen-Grossberg neural networks with both time-varying and distributed delays
- Global output convergence of recurrent neural networks with distributed delays
- Design and analysis of a novel chaotic diagonal recurrent neural network
- On Exponential Convergence Conditions of an Extended Projection Neural Network
- Long-Range Out-of-Sample Properties of Autoregressive Neural Networks
Cites Work
- Qualitative analysis of large scale dynamical systems
- Delay structure conditions for identifiability of closed loop systems
- A deterministic annealing neural network for convex programming
- New sufficient conditions for absolute stability of neural networks
- Analysis and design of a recurrent neural network for linear programming
- Global exponential stability of a class of neural circuits
- New conditions for global stability of neural networks with application to linear and quadratic programming problems
- Neural networks and physical systems with emergent collective computational abilities
- Neurons with graded response have collective computational properties like those of two-state neurons