On the global output convergence of a class of recurrent neural networks with time-varying inputs
From MaRDI portal
Publication: Q557639
DOI: 10.1016/J.NEUNET.2004.10.005
zbMATH Open: 1071.68537
OpenAlex: W1964856397
Wikidata: Q51518798 (Scholia: Q51518798)
MaRDI QID: Q557639 (FDO: Q557639)
Authors: Sanqing Hu, Derong Liu
Publication date: 30 June 2005
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2004.10.005
Recommendations
- Global output convergence of recurrent neural networks with distributed delays
- Global exponential periodicity of a class of recurrent neural networks with non-monotone activation functions and time-varying delays
- On global asymptotic stability of recurrent neural networks with time-varying delays
- Global exponential periodicity and global exponential stability of a class of recurrent neural networks with various activation functions and time-varying delays
- New results on input-to-state convergence for recurrent neural networks with variable inputs
Keywords: Lipschitz continuity; Optimization; Global output convergence; Lyapunov diagonal stability; Recurrent neural networks; Time-varying input
Cites Work
- Neural networks and physical systems with emergent collective computational abilities
- New conditions for global stability of neural networks with application to linear and quadratic programming problems
- Neurons with graded response have collective computational properties like those of two-state neurons
- Qualitative analysis of large scale dynamical systems
- A deterministic annealing neural network for convex programming
- Analysis and design of a recurrent neural network for linear programming
- Delay structure conditions for identifiability of closed loop systems
- New sufficient conditions for absolute stability of neural networks
- Global exponential stability of a class of neural circuits
Cited In (14)
- On Exponential Convergence Conditions of an Extended Projection Neural Network
- Prescribed convergence analysis of recurrent neural networks with parameter variations
- Design and analysis of a novel chaotic diagonal recurrent neural network
- Local stability of recurrent networks with time-varying weights and inputs
- Global output convergence of recurrent neural networks with distributed delays
- Global output convergence of Cohen-Grossberg neural networks with both time-varying and distributed delays
- New results on input-to-state convergence for recurrent neural networks with variable inputs
- Global Convergence Rate of Recurrently Connected Neural Networks
- Comparison of convergence and stability properties for the state and output solutions of neural networks
- Accelerating a recurrent neural network to finite-time convergence using a new design formula and its application to time-varying matrix square root
- Time varying stimulations in simple neural networks and convergence to desired outputs
- Long-Range Out-of-Sample Properties of Autoregressive Neural Networks