Pages that link to "Item:Q4505150"
From MaRDI portal
The following pages link to A simple proof of a necessary and sufficient condition for absolute stability of symmetric neural networks (Q4505150):
Displaying 10 items.
- Peak-to-peak exponential direct learning of continuous-time recurrent neural network models: a matrix inequality approach (Q395779)
- Some new results on stability of Takagi-Sugeno fuzzy Hopfield neural networks (Q410754)
- Exponential \(\mathcal H_{\infty}\) stable learning method for Takagi-Sugeno fuzzy delayed neural networks: a convex optimization approach (Q453804)
- Passive learning and input-to-state stability of switched Hopfield neural networks with time-delay (Q621590)
- A new robust training law for dynamic neural networks with external disturbance: an LMI approach (Q624433)
- New necessary and sufficient conditions for absolute stability of neural networks (Q866740)
- Necessary and sufficient condition for the absolute exponential stability of a class of neural networks with finite delay (Q973553)
- Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays (Q1883879)
- Robust stability of recurrent neural networks with ISS learning algorithm (Q2434143)
- Lyapunov function for interacting reinforced stochastic processes via Hopfield's energy function (Q6152034)