Necessary and sufficient condition for absolute stability of neural networks


DOI: 10.1109/81.298364 · zbMath: 0925.92014 · OpenAlex: W2103074333 · MaRDI QID: Q4255596

Stefano Manetti, Mauro Forti, Mauro Marini

Publication date: 18 August 1999

Published in: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications

Full work available at URL: https://doi.org/10.1109/81.298364



Related Items

- TWO-LAYER STABILIZATION OF CONTINUOUS NEURAL NETWORKS WITH FEEDBACKS
- Stability of artificial neural networks with impulses
- Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays
- Global asymptotic stability of Hopfield neural network involving distributed delays
- Global exponential stability in DCNNs with distributed delays and unbounded activations
- Global exponential convergence and global convergence in finite time of non-autonomous discontinuous neural networks
- Robust stability of stochastic delayed additive neural networks with Markovian switching
- Exponential Convergence of Delayed Dynamical Systems
- Global Convergence Rate of Recurrently Connected Neural Networks
- New absolute stability criteria for uncertain Lur'e systems with time-varying delays
- Novel results concerning global robust stability of delayed neural networks
- New necessary and sufficient conditions for absolute stability of neural networks
- Metaheuristics: A bibliography
- Robust stability analysis of uncertain stochastic neural networks with interval time-varying delay
- Global asymptotic stability of a class of feedback neural networks with an application to optimization problems
- Global properties for a class of dynamical neural circuits
- Centralized and decentralized global outer-synchronization of asymmetric recurrent time-varying neural network by data-sampling
- Dynamic analysis of unstable Hopfield networks
- Global exponential periodicity and global exponential stability of a class of recurrent neural networks
- Lyapunov function for interacting reinforced stochastic processes via Hopfield's energy function
- ROBUST SYNCHRONIZATION CRITERIA FOR RECURRENT NEURAL NETWORKS VIA LINEAR FEEDBACK
- Exponential stability of Cohen-Grossberg neural networks with a general class of activation functions
- Global point dissipativity of neural networks with mixed time-varying delays
- Necessary and sufficient conditions for the existence of a Lyapunov function with 'a quadratic form plus an integral term'
- $\mathcal{H}_{\infty}$ synchronization of chaotic Hopfield networks with time-varying delay: a resilient DOF control approach
- Exponential stability of neural networks with asymmetric connection weights
- Absolute exponential stability analysis of delayed neural networks
- GLOBAL EXPONENTIAL ROBUST STABILITY OF DELAYED NEURAL NETWORKS
- Absolute exponential stability analysis of delayed bi-directional associative memory neural networks
- Globally exponentially robust stability and periodicity of delayed neural networks
- Necessary and sufficient condition for the absolute exponential stability of a class of neural networks with finite delay
- A new method for exponential stability of coupled reaction-diffusion systems with mixed delays: combining Razumikhin method with graph theory
- Absolute exponential stability criteria for a class of nonlinear time-delay systems
- Stability analysis of Hopfield neural networks with uncertainty
- Global robust stability of delayed recurrent neural networks
- Neural network adaptive robust control with application to precision motion control of linear motors
- Multistability of neural networks with discontinuous activation function
- Dynamical behaviors of a class of recurrent neural networks with discontinuous neuron activations
- Adaptive robust convergence of neural networks with time-varying delays
- Exponential stability analysis of uncertain stochastic neural networks with multiple delays
- A weak condition for global stability of delayed neural networks
- Global exponential stability of Hopfield neural networks with delays and inverse Lipschitz neuron activations
- A weak condition of globally asymptotic stability for neural networks
- Extended dissipative learning of time-delay recurrent neural networks