Computing with Continuous Attractors: Stability and Online Aspects
DOI: 10.1162/0899766054615626
zbMath: 1080.68638
OpenAlex: W2134671148
Wikidata: Q48946611
MaRDI QID: Q5703544
Publication date: 8 November 2005
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/0899766054615626
Related Items
- Continuous attractors of a class of recurrent neural networks
- Dynamical Synapses Enhance Neural Information Processing: Gracefulness, Accuracy, and Mobility
- Neural Information Processing with Feedback Modulations
- Associative pattern recognition through macro-molecular self-assembly
- Continuous attractors of a class of neural networks with a large number of neurons
- Nonequilibrium Statistical Mechanics of Continuous Attractors
- A Moving Bump in a Continuous Manifold: A Comprehensive Study of the Tracking Dynamics of Continuous Attractor Neural Networks
- Change-Based Inference in Attractor Nets: Linear Analysis
- Dynamics and Computation of Continuous Attractors
- Heaviside World: Excitation and Self-Organization of Neural Fields
Cites Work
- Dynamics of pattern formation in lateral-inhibition type neural fields
- A model of visuospatial working memory in prefrontal cortex: Recurrent network and cellular bistability
- Multiple-spike waves in a one-dimensional integrate-and-fire neural network
- Population Coding with Correlation and an Unfaithful Model
- Population Coding and Decoding in a Neural Field: A Computational Study
- Optimal Short-Term Population Coding: When Fisher Information Fails
- Sequential Bayesian Decoding with a Population of Neurons
- Bayesian Computation in Recurrent Neural Circuits
- Neurons with graded response have collective computational properties like those of two-state neurons