New necessary and sufficient conditions for absolute stability of neural networks
From MaRDI portal
Publication: 866740
DOI: 10.1016/j.neunet.2006.06.003
zbMATH Open: 1158.68443
OpenAlex: W2161026336
Wikidata: Q51150969 (Scholia: Q51150969)
MaRDI QID: Q866740
FDO: Q866740
Authors: Tianguang Chu, Cishen Zhang
Publication date: 14 February 2007
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2006.06.003
Recommendations
- New sufficient conditions for absolute stability of neural networks
- Necessary and sufficient condition for absolute stability of neural networks
- A sufficient condition for absolute stability of a larger class of dynamical neural networks
- On absolute stability of a class of neural networks with feedback
- A comment on "Comments on 'Necessary and sufficient condition for absolute stability of neural networks'"
- Necessary and sufficient condition for the absolute exponential stability of a class of neural networks with finite delay
- scientific article; zbMATH DE number 5302448
- Absolute stability criterion for discrete time neural networks
Keywords: absolute stability; global asymptotic stability; neural networks; exponential convergence; asymmetric connection; solvable Lie algebra condition
Cites Work
- Title not available
- Cellular neural networks: theory
- Neural networks and physical systems with emergent collective computational abilities
- Title not available
- Absolute stability of global pattern formation and parallel memory storage by competitive neural networks
- Title not available
- A decomposition approach to analysis of competitive-cooperative neural networks with delay
- Absolutely exponential stability of a class of neural networks with unbounded delay
- A simple proof of a necessary and sufficient condition for absolute stability of symmetric neural networks
- Collective Computation With Continuous Variables
- Comments on "Necessary and sufficient condition for absolute stability of neural networks"
- Necessary and sufficient condition for absolute stability of neural networks
- Absolute exponential stability of neural networks with a general class of activation functions
Cited In (9)
- Peak-to-peak exponential direct learning of continuous-time recurrent neural network models: a matrix inequality approach
- Some new results on stability of Takagi-Sugeno fuzzy Hopfield neural networks
- An equivalent condition for stability properties of Lotka-Volterra systems
- Exponential \(\mathcal H_{\infty}\) stable learning method for Takagi-Sugeno fuzzy delayed neural networks: a convex optimization approach
- Passive learning and input-to-state stability of switched Hopfield neural networks with time-delay
- Title not available
- An \(\mathcal H_{\infty}\) approach to stability analysis of switched Hopfield neural networks with time-delay
- Robust stability of recurrent neural networks with ISS learning algorithm
- A new robust training law for dynamic neural networks with external disturbance: an LMI approach