Multistability in bidirectional associative memory neural networks
From MaRDI portal
Publication: Q637468
DOI: 10.1016/j.physleta.2007.12.053
zbMath: 1220.92002
MaRDI QID: Q637468
Publication date: 2 September 2011
Published in: Physics Letters A
Full work available at URL: https://doi.org/10.1016/j.physleta.2007.12.053
65C20: Probabilistic models, generic numerical methods in probability and statistics
68T05: Learning and adaptive systems in artificial intelligence
92B20: Neural networks for/in biological studies, artificial life and related topics
Related Items
- Impulsive effects on stochastic bidirectional associative memory neural networks with reaction-diffusion and leakage delays
- Robustness analysis of global exponential stability in neural networks evoked by deviating argument and stochastic disturbance
- Multistability analysis of delayed recurrent neural networks with a class of piecewise nonlinear activation functions
- Multistability analysis for a general class of delayed Cohen-Grossberg neural networks
- Multistability in impulsive hybrid Hopfield neural networks with distributed delays
- Impulsive hybrid discrete-time Hopfield neural networks with delays and multistability analysis
- Passivity analysis for uncertain neural networks with discrete and distributed time-varying delays
- Exponential \(p\)-convergence analysis for stochastic BAM neural networks with time-varying and infinite distributed delays
- Global exponential convergence of generalized chaotic systems with multiple time-varying and finite distributed delays
- Coexistence and local \(\mu\)-stability of multiple equilibrium points for complex-valued Cohen-Grossberg neural networks with unbounded time-varying delays
- Multistability in a class of stochastic delayed Hopfield neural networks
- Multistability and instability of competitive neural networks with Mexican-hat-type activation functions
- Lagrange \(\alpha\)-exponential stability and \(\alpha\)-exponential convergence for fractional-order complex-valued neural networks
- Global Lagrange stability for neutral-type Cohen-Grossberg BAM neural networks with mixed time-varying delays
- Multistability and multiperiodicity for a general class of delayed Cohen-Grossberg neural networks with discontinuous activation functions
- Multistability and multiperiodicity of delayed bidirectional associative memory neural networks with discontinuous activation functions
- Dynamical stability analysis of delayed recurrent neural networks with ring structure
- Multistable learning dynamics in second-order neural networks with time-varying delays
- Coexistence of multistability and chaos in a ring of discrete neural network with delays
Cites Work
- Unnamed Item
- Unnamed Item
- Exponential stability of high-order bidirectional associative memory neural networks with time delays
- Multistability and convergence in delayed neural networks
- Memory pattern analysis of cellular neural networks
- Multistability and multiperiodicity of delayed Cohen-Grossberg neural networks with a general class of activation functions
- Stability and Hopf bifurcation analysis on a four-neuron BAM neural network with time delays
- Coincidence degree, and nonlinear differential equations
- Globally exponentially robust stability and periodicity of delayed neural networks
- Global robust stability of delayed recurrent neural networks
- LMI-based approach for delay-dependent exponential stability analysis of BAM neural networks
- Stability and bifurcation in a neural network model with two delays
- Boundedness and stability for Cohen-Grossberg neural network with time-varying delays
- Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays
- Stability Analysis of Delayed Cellular Neural Networks Described Using Cloning Templates
- Neurons with graded response have collective computational properties like those of two-state neurons
- Multiperiodicity and Exponential Attractivity Evoked by Periodic External Inputs in Delayed Cellular Neural Networks
- Multistability in Recurrent Neural Networks