Global Convergence Rate of Recurrently Connected Neural Networks
Publication: 4815044
DOI: 10.1162/089976602760805359 · zbMath: 1079.68584 · OpenAlex: W2098882519 · Wikidata: Q78678601 · Scholia: Q78678601 · MaRDI QID: Q4815044
Wenlian Lu, Tian-Ping Chen, Shun-ichi Amari
Publication date: 19 August 2004
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976602760805359
Related Items (9)
- The existence and exponential attractivity of \(\kappa\)-almost periodic sequence solution of discrete time neural networks
- New conditions on global stability of Cohen-Grossberg neural networks
- Convergence of discrete-time recurrent neural networks with variable delay
- The existence and global attractivity of almost periodic sequence solution of discrete-time neural networks
- Batch gradient method with smoothing \(L_{1/2}\) regularization for training of feedforward neural networks
- Some generalized sufficient convergence criteria for nonlinear continuous neural networks
- Dynamical behaviors of a large class of general delayed neural networks
- Stability analysis of a class of generalized neural networks with delays
- Dynamical behaviors of Cohen-Grossberg neural networks with discontinuous activation functions
Cites Work
- Absolute exponential stability of neural networks with a general class of activation functions
- Necessary and sufficient condition for absolute stability of neural networks
- Global exponential stability of a class of neural circuits
- On a class of globally stable neural circuits
- New conditions for global stability of neural networks with application to linear and quadratic programming problems
- Neurons with graded response have collective computational properties like those of two-state neurons.