Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations (Q2492267)

From MaRDI portal
scientific article

    Statements

    Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations (English)
    9 June 2006
    The authors consider a class of discontinuous neural networks described by a system of differential equations of the form \[ \dot x=Bx+Tg(x)+I, \] where \(x\in{\mathbb R}^n\) is the vector of neuron state variables, \(B\) is a diagonal matrix with negative coefficients modeling the neuron self-inhibitions, \(T\) is the matrix of neuron interconnections, \(g(x)\) is the neuron activation function and \(I\in{\mathbb R}^n\) is the vector of neuron biasing inputs. The main hypothesis is that the components of \(g(x)\) are bounded, non-decreasing and piecewise continuous functions; since \(g\) may be discontinuous, solutions must be understood in the sense of Filippov. In this setting, new results on global exponential convergence and global convergence in finite time are proved. The proofs rely on a generalized Lyapunov-like approach that could be of independent interest for proving convergence of other nonsmooth dynamical systems.
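    The dynamics above can be illustrated numerically. The following minimal sketch integrates \(\dot x = Bx + Tg(x) + I\) by forward Euler for a hypothetical two-neuron example with the discontinuous sign activation; the matrices \(B\), \(T\) and the input \(I\) are illustrative choices for this sketch, not values from the paper, and away from sliding regions the Euler trajectory approximates the Filippov solution.

    ```python
    def sign(v):
        # Discontinuous activation: one admissible choice of g
        return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

    def simulate(x0, B, T, I, dt=0.01, steps=2000):
        # Forward-Euler integration of x' = Bx + T g(x) + I with g = sign.
        x = list(x0)
        n = len(x)
        for _ in range(steps):
            gx = [sign(xi) for xi in x]
            x = [x[i] + dt * (sum(B[i][j] * x[j] + T[i][j] * gx[j]
                                  for j in range(n)) + I[i])
                 for i in range(n)]
        return x

    # Illustrative parameters (assumed, not from the paper):
    # B diagonal with negative entries (self-inhibition),
    # T a symmetric interconnection matrix, I a constant bias.
    B = [[-1.0, 0.0], [0.0, -1.0]]
    T = [[0.0, 0.1], [0.1, 0.0]]
    I = [0.5, -0.5]

    x_final = simulate([1.0, 1.0], B, T, I)
    # Any equilibrium solves 0 = Bx + T sign(x) + I;
    # for these parameters that gives x* = (0.4, -0.4).
    ```

    Starting from \((1,1)\), the trajectory crosses the discontinuity surface \(x_2=0\) once and then settles at the equilibrium, consistent with the global convergence behaviour the paper establishes under its hypotheses.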
    discontinuous neural networks
    global exponential stability
    convergence in finite time
    Lyapunov approach
    generalized gradient
    \(M\)-matrices and \(H\)-matrices

    Identifiers