On the global output convergence of a class of recurrent neural networks with time-varying inputs
From MaRDI portal
Recommendations
- Global output convergence of recurrent neural networks with distributed delays
- Global exponential periodicity of a class of recurrent neural networks with non-monotone activation functions and time-varying delays
- On global asymptotic stability of recurrent neural networks with time-varying delays
- Global exponential periodicity and global exponential stability of a class of recurrent neural networks with various activation functions and time-varying delays
- New results on input-to-state convergence for recurrent neural networks with variable inputs
Cites work
- A deterministic annealing neural network for convex programming
- Analysis and design of a recurrent neural network for linear programming
- Delay structure conditions for identifiability of closed loop systems
- Global exponential stability of a class of neural circuits
- Neural networks and physical systems with emergent collective computational abilities
- Neurons with graded response have collective computational properties like those of two-state neurons
- New conditions for global stability of neural networks with application to linear and quadratic programming problems
- New sufficient conditions for absolute stability of neural networks
- Qualitative analysis of large scale dynamical systems
Cited in (14)
- Long-Range Out-of-Sample Properties of Autoregressive Neural Networks
- scientific article (zbMATH DE number 2156307; no title available)
- scientific article (zbMATH DE number 811531; no title available)
- Prescribed convergence analysis of recurrent neural networks with parameter variations
- On Exponential Convergence Conditions of an Extended Projection Neural Network
- Design and analysis of a novel chaotic diagonal recurrent neural network
- Local stability of recurrent networks with time-varying weights and inputs
- Global output convergence of recurrent neural networks with distributed delays
- Global output convergence of Cohen-Grossberg neural networks with both time-varying and distributed delays
- New results on input-to-state convergence for recurrent neural networks with variable inputs
- Global Convergence Rate of Recurrently Connected Neural Networks
- Accelerating a recurrent neural network to finite-time convergence using a new design formula and its application to time-varying matrix square root
- Comparison of convergence and stability properties for the state and output solutions of neural networks
- Time varying stimulations in simple neural networks and convergence to desired outputs
This page was built for publication: On the global output convergence of a class of recurrent neural networks with time-varying inputs