Further stability criterion on delayed recurrent neural networks based on reciprocal convex technique (Q1955239)

From MaRDI portal
scientific article

    Statements

    11 June 2013
    Summary: Using Lyapunov-Krasovskii functional theory together with the reciprocal convex technique, a new sufficient condition is derived that guarantees global stability for recurrent neural networks with both time-varying and continuously distributed delays; an improved delay-partitioning technique is employed in the derivation. The LMI-based criterion depends on both the upper and lower bounds of the state delay and of its derivative, which distinguishes it from existing criteria and widens its range of application when the lower bound of the delay derivative is available. Finally, numerical examples illustrate the reduced conservatism of the derived results obtained by thinning the delay interval.
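    The criterion described above is an LMI feasibility condition built from a Lyapunov-Krasovskii functional, and verifying it requires a semidefinite-programming solver. As a hedged, self-contained stand-in, the sketch below illustrates only the underlying Lyapunov idea on a delay-free linearised network x' = Ax: solve the Lyapunov equation A^T P + P A = -Q and check that P is positive definite. The matrix A is a made-up example, not from the paper.

    ```python
    # Illustrative sketch only: the paper's actual criterion is a delay-dependent
    # LMI using delay partitioning and the reciprocal convex technique. Here we
    # check classical Lyapunov stability of a hypothetical delay-free system.
    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    A = np.array([[-2.0, 0.5],
                  [0.3, -1.5]])   # hypothetical system matrix (assumption)
    Q = np.eye(2)                 # any positive definite Q

    # SciPy solves a x + x a^H = q, so pass A.T and -Q to get A^T P + P A = -Q.
    P = solve_continuous_lyapunov(A.T, -Q)

    # P positive definite  =>  x' = A x is globally asymptotically stable.
    stable = bool(np.all(np.linalg.eigvalsh(P) > 0))
    print(stable)
    ```

    In the delayed setting treated by the paper, this single matrix inequality is replaced by a block LMI whose feasibility additionally depends on the delay bounds and the bounds on the delay derivative.
    
    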
