Global exponential stability in Lagrange sense for recurrent neural networks with both time-varying delays and general activation functions via LMI approach (Q546172)
From MaRDI portal
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Global exponential stability in Lagrange sense for recurrent neural networks with both time-varying delays and general activation functions via LMI approach | scientific article | |
Statements
Global exponential stability in Lagrange sense for recurrent neural networks with both time-varying delays and general activation functions via LMI approach (English)
24 June 2011
The global exponential stability in the Lagrange sense for recurrent neural networks with both time-varying delays and general activation functions is studied. Under the assumption that the activation functions need be neither bounded, nor monotonic, nor differentiable, several algebraic criteria in linear matrix inequality (LMI) form for the global exponential stability in the Lagrange sense of such neural networks are obtained by means of Lyapunov functions and the Halanay delay differential inequality. Moreover, detailed estimates of a globally exponentially attractive set for recurrent neural networks with time-varying delays are established. When the system has a unique equilibrium point, the results obtained here show that this equilibrium point is globally exponentially stable in the Lyapunov sense. Finally, two examples are given and analyzed to demonstrate the results.
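The key tool cited in the abstract is the Halanay delay differential inequality: if \(\dot V(t) \le -a\,V(t) + b \sup_{t-\tau \le s \le t} V(s)\) with \(a > b > 0\), then \(V(t)\) decays exponentially at the rate \(\lambda > 0\) solving \(\lambda = a - b\,e^{\lambda \tau}\). As an illustrative sketch only (not taken from the paper; the constants `a`, `b`, `tau` below are arbitrary), the following Python snippet computes this decay rate and checks the resulting exponential bound against a worst-case simulation of the inequality:

```python
# Illustrative sketch of the Halanay delay differential inequality
# (constants a, b, tau are arbitrary choices, not from the paper):
#   dV/dt <= -a*V(t) + b * sup_{t - tau <= s <= t} V(s),  a > b > 0,
# implies V(t) <= (sup of initial history) * exp(-lam * t),
# where lam is the unique positive root of lam = a - b*exp(lam*tau).

import math

def halanay_rate(a, b, tau, iters=100):
    """Solve lam + b*exp(lam*tau) - a = 0 by bisection on (0, a)."""
    lo, hi = 0.0, a  # f is increasing in lam; root lies in (0, a) when a > b > 0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mid + b * math.exp(mid * tau) - a > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def simulate(a, b, tau, T=20.0, dt=1e-3, v0=1.0):
    """Forward-Euler integration of the worst case
    dv/dt = -a*v(t) + b * max of v over the delayed window [t - tau, t],
    starting from the constant initial history v = v0 on [-tau, 0]."""
    n_delay = int(round(tau / dt))
    hist = [v0] * (n_delay + 1)  # sliding window of past values
    v = v0
    for _ in range(int(round(T / dt))):
        v = v + dt * (-a * v + b * max(hist))
        hist.append(v)
        hist.pop(0)
    return v

a, b, tau, T = 2.0, 1.0, 0.5, 20.0
lam = halanay_rate(a, b, tau)        # predicted exponential decay rate
vT = simulate(a, b, tau, T=T)        # simulated worst-case value at time T
bound = 1.0 * math.exp(-lam * T)     # Halanay bound with sup of history = 1
print(f"lam = {lam:.4f}, v(T) = {vT:.3e}, bound = {bound:.3e}")
```

The simulated trajectory stays below the predicted bound, which is the mechanism the paper's Lyapunov-function arguments exploit to obtain the LMI-based attractive-set estimates.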
neural network
Lagrange exponential stability
globally exponentially attractive set
Halanay delay differential inequality
LMI approach