Necessary and sufficient condition for the absolute exponential stability of a class of neural networks with finite delay
DOI: 10.1016/j.physleta.2005.11.038
zbMath: 1187.34100
OpenAlex: W2075385310
Wikidata: Q61891390 (Scholia: Q61891390)
MaRDI QID: Q973553
Tingwen Huang, Chuandong Li, Jinde Cao
Publication date: 2 June 2010
Published in: Physics Letters A
Full work available at URL: https://doi.org/10.1016/j.physleta.2005.11.038
Keywords: exponential stability; neural networks; M-matrix; necessary and sufficient condition; finite delay; absolute exponential stability
Mathematics Subject Classification:
- Lyapunov and other classical stabilities (Lagrange, Poisson, \(L^p, l^p\), etc.) in control theory (93D05)
- Stability theory of functional-differential equations (34K20)
- Dynamical systems in control (37N35)
Related Items (9)
Cites Work
- Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays
- A condition for global convergence of a class of symmetric neural circuits
- Necessary and sufficient condition for absolute stability of neural networks
- A simple proof of a necessary and sufficient condition for absolute stability of symmetric neural networks
- New conditions for global stability of neural networks with application to linear and quadratic programming problems