Codimension-2 parameter space structure of continuous-time recurrent neural networks
From MaRDI portal
Abstract: If we are ever to move beyond the study of isolated special cases in theoretical neuroscience, we need to develop more general theories of neural circuits over a given neural model. The present paper considers this challenge in the context of continuous-time recurrent neural networks (CTRNNs), a simple but dynamically universal model that has been widely utilized in both computational neuroscience and neural networks. Here we extend previous work on the parameter space structure of codimension-1 local bifurcations in CTRNNs to include codimension-2 local bifurcation manifolds. Specifically, we derive the necessary conditions for all generic local codimension-2 bifurcations for general CTRNNs, specialize these conditions to circuits containing from one to four neurons, illustrate in full detail the application of these conditions to example circuits, derive closed-form expressions for these bifurcation manifolds where possible, and demonstrate how this analysis allows us to find and trace several global codimension-1 bifurcation manifolds that originate from the codimension-2 bifurcations.
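The abstract's program can be illustrated numerically. A minimal sketch, assuming the standard CTRNN form tau_i * dy_i/dt = -y_i + sum_j w_ji * sigma(y_j + theta_j): locate an equilibrium, build the Jacobian there, and evaluate the scalar test functions whose zeros mark local bifurcations (for two neurons, det J = 0 for a saddle-node, tr J = 0 with det J > 0 for a Hopf; both vanishing together marks a codimension-2 Bogdanov-Takens point). The weight values below are hypothetical, chosen only for illustration, and this is not the paper's own derivation of the conditions.

```python
import numpy as np
from scipy.optimize import fsolve

def sigma(x):
    """Standard logistic activation."""
    return 1.0 / (1.0 + np.exp(-x))

def dsigma(x):
    """Derivative of the logistic: sigma(x) * (1 - sigma(x))."""
    s = sigma(x)
    return s * (1.0 - s)

def ctrnn_rhs(y, w, theta, tau):
    """dy_i/dt = (-y_i + sum_j w_ji * sigma(y_j + theta_j)) / tau_i."""
    return (-y + w.T @ sigma(y + theta)) / tau

def ctrnn_jacobian(y, w, theta, tau):
    """Jacobian entries J_ik = (-delta_ik + w_ki * sigma'(y_k + theta_k)) / tau_i."""
    n = len(y)
    return (-np.eye(n) + w.T * dsigma(y + theta)) / tau[:, None]

# Hypothetical two-neuron circuit (self-excitation plus cross-coupling).
w = np.array([[5.0, -1.0],
              [1.0,  5.0]])
theta = np.array([-2.5, -2.5])
tau = np.array([1.0, 1.0])

# Find an equilibrium and evaluate the bifurcation test functions there.
y_eq = fsolve(ctrnn_rhs, np.zeros(2), args=(w, theta, tau))
J = ctrnn_jacobian(y_eq, w, theta, tau)
det_J, tr_J = np.linalg.det(J), np.trace(J)
print(f"det J = {det_J:.4f}, tr J = {tr_J:.4f}")
```

Tracing the zero sets of these test functions over, say, (theta_1, theta_2) yields the codimension-1 bifurcation curves, and their intersections are candidates for the codimension-2 points the paper analyzes.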
Recommendations
- Parameter Space Structure of Continuous-Time Recurrent Neural Networks
- Bifurcation analysis on a generalized recurrent neural network with two interconnected three-neuron components
- Continuous attractors of a class of recurrent neural networks
- Attractive periodic sets in discrete-time recurrent networks (with emphasis on fixed-point stability and bifurcations in two-neuron networks)
- Poincaré mapping of continuous recurrent neural networks excited by temporal external input
Cites work
- scientific article; zbMATH DE number 3820765
- scientific article; zbMATH DE number 3714116
- scientific article; zbMATH DE number 3757349
- A review of recurrent neural networks: LSTM cells and network architectures
- Attractive periodic sets in discrete-time recurrent networks (with emphasis on fixed-point stability and bifurcations in two-neuron networks)
- Bifurcation analysis of a neural network model
- Center-Crossing Recurrent Neural Networks for the Evolution of Rhythmic Behavior
- Complex dynamics and the structure of small neural networks
- Computing Hopf Bifurcations I
- Elements of applied bifurcation theory
- Introduction to Applied Nonlinear Dynamical Systems and Chaos
- Mathematical equivalence of two common forms of firing rate models of neural networks
- Mathematical foundations of neuroscience
- Modeling of continuous time dynamical systems with input by recurrent neural networks
- Neural Networks for Combinatorial Optimization: A Review of More Than a Decade of Research
- Neurons with graded response have collective computational properties like those of two-state neurons
- Nonlinear oscillations, dynamical systems, and bifurcations of vector fields
- Parameter Space Structure of Continuous-Time Recurrent Neural Networks
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Singularities of vector fields
- Two neuron dynamics and adiabatic elimination
- Weakly connected neural networks
- Neural computation of decisions in optimization problems
Cited in (2)
This page was built for publication: Codimension-2 parameter space structure of continuous-time recurrent neural networks
(MaRDI item Q2165380)