A simple proof of a necessary and sufficient condition for absolute stability of symmetric neural networks
Publication: 4505150
DOI: 10.1109/81.721271
zbMath: 1055.93548
OpenAlex: W2152615294
MaRDI QID: Q4505150
Publication date: 26 September 2000
Published in: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
Full work available at URL: https://doi.org/10.1109/81.721271
MSC classification: Neural networks for/in biological studies, artificial life and related topics (92B20); Stability of control systems (93D99)
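The record does not state the stability condition itself; in the classical result for symmetric networks that the title appears to refer to (assumed here, not taken from this record), absolute stability is equivalent to the symmetric interconnection matrix T being negative semidefinite. A minimal numerical check under that assumption:

```python
import numpy as np

def is_absolutely_stable(T, tol=1e-9):
    """Check negative semidefiniteness of a symmetric interconnection matrix T.

    Under the assumed classical condition, a symmetric network is absolutely
    stable iff the largest eigenvalue of T is <= 0.
    """
    T = np.asarray(T, dtype=float)
    if not np.allclose(T, T.T):
        raise ValueError("T must be symmetric")
    # eigvalsh is the appropriate routine for symmetric matrices:
    # it returns real eigenvalues in ascending order.
    return float(np.linalg.eigvalsh(T)[-1]) <= tol

# A negative semidefinite T passes; an indefinite one fails.
print(is_absolutely_stable([[-2.0, 1.0], [1.0, -2.0]]))  # True  (eigenvalues -3, -1)
print(is_absolutely_stable([[1.0, 0.0], [0.0, -1.0]]))   # False (eigenvalues -1, +1)
```

The function name and tolerance parameter are illustrative choices, not part of the cited paper.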
Related Items:
- Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays
- New necessary and sufficient conditions for absolute stability of neural networks
- Peak-to-peak exponential direct learning of continuous-time recurrent neural network models: a matrix inequality approach
- Passive learning and input-to-state stability of switched Hopfield neural networks with time-delay
- Some new results on stability of Takagi-Sugeno fuzzy Hopfield neural networks
- A new robust training law for dynamic neural networks with external disturbance: an LMI approach
- Lyapunov function for interacting reinforced stochastic processes via Hopfield's energy function
- Robust stability of recurrent neural networks with ISS learning algorithm
- Exponential \(\mathcal H_{\infty}\) stable learning method for Takagi-Sugeno fuzzy delayed neural networks: a convex optimization approach
- Necessary and sufficient condition for the absolute exponential stability of a class of neural networks with finite delay