Neural networks with discontinuous/impact activations (Q358276)

From MaRDI portal
scientific article
Language: English
Label: Neural networks with discontinuous/impact activations
Description: scientific article

    Statements

    Neural networks with discontinuous/impact activations (English)
    16 August 2013
    In this book several classes of neural networks are studied: recurrent neural networks (RNNs), Cohen-Grossberg neural networks, Hopfield neural networks, and cellular neural networks (CNNs). These networks are described by difference and differential equations arising from their applications in pattern recognition, associative memories, image and signal processing, artificial intelligence, etc. The features common to all the networks under consideration are impact activations and piecewise constant arguments. The book obtains sufficient conditions for the global existence and uniqueness of solutions and for the global asymptotic stability of the equilibrium of the studied neural networks. The existence of periodic solutions and their global asymptotic stability are proved for \( (\theta, \Theta) \)- and \( (\Theta, \tau) \)-type impulsive neural networks. The methods used are the standard Lyapunov function method and the Lyapunov-Razumikhin technique. Chapters 2 and 3 deal with differential equations with piecewise constant argument of generalized type and with impulsive differential equations, respectively; applications to the logistic equation are considered. In Chapters 4, 5 and 6, neural networks and impulsive neural networks with piecewise constant argument of generalized type are studied with respect to the existence of a unique equilibrium and the stability of periodic solutions. (The title of Paragraph 6.2.1 should read ``Existence and stability of periodic solutions''.) Chapters 7 and 8 apply the Lyapunov function method to the stability of RNNs and the Lyapunov-Razumikhin technique to the stability of CNNs. Several examples are given, although applications of these examples are not.
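    To indicate the kind of system treated, the following is only a sketch of a typical Hopfield-type network with piecewise constant argument of generalized type and impulsive (impact) actions; the notation (rates \( a_i \), weights \( b_{ij}, c_{ij} \), activations \( f_j, g_j \), inputs \( d_i \), impact functions \( I_{ik} \)) is illustrative and not taken verbatim from the book:
    \[
    x_i'(t) = -a_i x_i(t) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} c_{ij} g_j(x_j(\beta(t))) + d_i, \qquad t \neq \theta_k,
    \]
    \[
    \Delta x_i(\theta_k) := x_i(\theta_k^+) - x_i(\theta_k) = I_{ik}(x_i(\theta_k)), \qquad i = 1, \dots, n, \quad k \in \mathbb{N},
    \]
    where \( \beta(t) = \theta_k \) for \( \theta_k \le t < \theta_{k+1} \) is the piecewise constant argument and the functions \( I_{ik} \) model the impacts at the moments \( \theta_k \). Stability of an equilibrium \( x^* \) is then studied via Lyapunov functions such as \( V(x) = \sum_{i=1}^{n} |x_i - x_i^*| \), combined with Razumikhin-type conditions to handle the deviating argument.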
    neural networks
    recurrent neural networks
    cellular neural networks
    Cohen-Grossberg neural networks
    Hopfield neural networks
    impulsive differential equations
    existence
    uniqueness
    stability
    periodic solutions
