How Optimal Stimuli for Sensory Neurons Are Constrained by Network Architecture
From MaRDI portal
Publication:5453533
Recommendations
- Optimal Neuronal Tuning for Finite Stimulus Spaces
- Information maximization explains the sparseness of presynaptic neural response
- Metabolic cost of neuronal information in an empirical stimulus-response model
- A stimulus-dependent connectivity analysis of neuronal networks
- Switching neuronal state: optimal stimuli revealed using a stochastically-seeded gradient algorithm
Cites work
- Characteristics of Random Nets of Analog Neuron-Like Elements
- Neurons with graded response have collective computational properties like those of two-state neurons
- New Conditions on Global Stability of Cohen-Grossberg Neural Networks
- New conditions for global stability of neural networks with application to linear and quadratic programming problems
- The alopex process: Visual receptive fields by response feedback
Cited in (5)
- Abstract stimulus-specific adaptation models
- Optimal Neuronal Tuning for Finite Stimulus Spaces
- How to modify a neural network gradually without changing its input-output functionality
- Active data collection for efficient estimation and comparison of nonlinear neural models
- Sensitivity to Stimulus Irregularity Is Inherent in Neural Networks