Robust stability of recurrent neural networks with ISS learning algorithm
DOI: 10.1007/s11071-010-9901-5 · zbMath: 1280.93065 · OpenAlex: W2017812612 · MaRDI QID: Q2434143
Publication date: 17 February 2014
Published in: Nonlinear Dynamics
Full work available at URL: https://doi.org/10.1007/s11071-010-9901-5
Keywords: dynamic neural networks; linear matrix inequality (LMI); two-link robotic manipulator; input-to-state stability (ISS) approach; weight learning algorithm
MSC classification: Learning and adaptive systems in artificial intelligence (68T05); Input-output approaches in control theory (93D25); Robust stability (93D09)
Related Items
- Lur'e-Postnikov Lyapunov function approach to global robust Mittag-Leffler stability of fractional-order neural networks
- Event-triggered state estimation for a class of delayed recurrent neural networks with sampled-data information
- Exponential state estimation for delayed recurrent neural networks with sampled-data
- Peak-to-peak exponential direct learning of continuous-time recurrent neural network models: a matrix inequality approach
- Input-to-state stability for dynamical neural networks with time-varying delays
- Design of sampled data state estimator for Markovian jumping neural networks with leakage time-varying delays and discontinuous Lyapunov functional approach
- Almost sure stability of the delayed Markovian jump RDNNs
- Mean-square exponential input-to-state stability of delayed Cohen-Grossberg neural networks with Markovian switching based on vector Lyapunov functions
- Event-triggered synchronization control for T-S fuzzy neural networked systems with time delay
Cites Work
- Unnamed Item
- Unnamed Item
- On characterizations of the input-to-state stability property
- Identification of nonlinear dynamical systems using multilayered neural networks
- New necessary and sufficient conditions for absolute stability of neural networks
- Comments on integral variants of ISS
- Neural networks for control systems - a survey
- Small-gain theorem for ISS systems and applications
- Dynamic neural observers and their application for identification and purification of water by ozone
- Regression and the Moore-Penrose pseudoinverse
- Further facts about input to state stabilization
- A simple proof of a necessary and sufficient condition for absolute stability of symmetric neural networks
- Input-to-state stability (ISS) analysis for dynamic neural networks
- Connections between Razumikhin-type theorems and the ISS nonlinear small gain theorem
- Lur'e systems with multilayer perceptron and recurrent neural networks: absolute stability and dissipativity
- Input-to-state stabilization of switched nonlinear systems
- Power characterizations of input-to-state stability and integral input-to-state stability
- Some stability properties of dynamic neural networks
- Smooth stabilization implies coprime factorization
- Neurons with graded response have collective computational properties like those of two-state neurons.
- Singular perturbations and input-to-state stability