Theory of Recurrent Neural Network with Common Synaptic Inputs
Publication:3371809
Abstract: We discuss the effects of common synaptic inputs in a recurrent neural network. Because of these common inputs, correlations between neural inputs cannot be ignored, so the network exhibits sample dependence: it has no well-defined thermodynamic limit, and self-averaging breaks down. A suitable theory therefore has to be developed without relying on these properties. While the effects of common synaptic inputs have been analyzed in layered neural networks, analyzing them in recurrent neural networks has been difficult because of the feedback connections. We investigate a sequential associative memory model as an example of a recurrent network and derive a macroscopic dynamical description in the form of a recurrence relation for a probability density function.
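As a concrete illustration of the kind of model the abstract refers to, the following is a minimal sketch (in Python with NumPy) of a sequential associative memory network with Hebbian sequence couplings J_ij = (1/N) Σ_μ ξ_i^{μ+1} ξ_j^μ and synchronous sign updates. This is an assumed standard formulation, not code from the paper; all parameter values (N, p, T, seeds) are illustrative. Running several pattern realizations at finite loading p/N shows the sample-to-sample spread of the overlap trajectory that motivates the analysis.

```python
import numpy as np

def make_network(N, p, rng):
    """Hebbian sequence couplings for p cyclically stored random patterns."""
    xi = rng.choice([-1, 1], size=(p, N))            # random binary patterns
    # J_ij = (1/N) sum_mu xi_i^{mu+1} xi_j^mu  (indices taken cyclically)
    J = (xi[np.arange(1, p + 1) % p].T @ xi) / N
    return xi, J

def run(N=2000, p=100, T=15, seed=0):
    """Return the overlap m^mu(t) with the pattern expected at each step."""
    rng = np.random.default_rng(seed)
    xi, J = make_network(N, p, rng)
    s = xi[0].copy()                                  # start at the first pattern
    overlaps = []
    for t in range(T):
        mu = t % p                                    # pattern the state should visit at step t
        overlaps.append(xi[mu] @ s / N)               # overlap m^mu(t)
        s = np.sign(J @ s)                            # synchronous deterministic update
        s[s == 0] = 1                                 # break ties
    return overlaps

# Sample dependence: at finite loading p/N, the overlap trajectory varies
# from one pattern realization (seed) to the next, i.e. self-averaging
# is imperfect in a single finite sample.
for seed in range(3):
    print(seed, np.round(run(seed=seed), 3))
```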
Recommendations
- A recurrent neural network with ever changing synapses
- Associative memory by recurrent neural networks with delay elements
- Symmetric sequence processing in a recurrent neural network model with a synchronous dynamics
- Dynamical responses of chaotic memory dynamics to weak input in a recurrent neural network model
- Correlation of Firing in Layered Associative Neural Networks