On the global output convergence of a class of recurrent neural networks with time-varying inputs
MaRDI QID: Q557639
DOI: 10.1016/j.neunet.2004.10.005
zbMath: 1071.68537
OpenAlex: W1964856397
Wikidata: Q51518798 (Scholia: Q51518798)
Publication date: 30 June 2005
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2004.10.005
Keywords: Optimization; Lipschitz continuity; Global output convergence; Lyapunov diagonal stability; Recurrent neural networks; Time-varying input
Related Items (5)
- Global output convergence of recurrent neural networks with distributed delays
- Global output convergence of Cohen-Grossberg neural networks with both time-varying and distributed delays
- On Exponential Convergence Conditions of an Extended Projection Neural Network
- Long-Range Out-of-Sample Properties of Autoregressive Neural Networks
- Design and analysis of a novel chaotic diagonal recurrent neural network
Cites Work
- Qualitative analysis of large scale dynamical systems
- Delay structure conditions for identifiability of closed loop systems
- A deterministic annealing neural network for convex programming
- New sufficient conditions for absolute stability of neural networks
- Analysis and design of a recurrent neural network for linear programming
- Global exponential stability of a class of neural circuits
- New conditions for global stability of neural networks with application to linear and quadratic programming problems
- Neural networks and physical systems with emergent collective computational abilities
- Neurons with graded response have collective computational properties like those of two-state neurons