Variable threshold as a model for selective attention, (de)sensitization, and anesthesia in associative neural networks (Q922986)

From MaRDI portal

scientific article

Language: English
Label: Variable threshold as a model for selective attention, (de)sensitization, and anesthesia in associative neural networks
Description: scientific article

Statements

Variable threshold as a model for selective attention, (de)sensitization, and anesthesia in associative neural networks (English)
1991
We study the influence of a variable neuronal threshold on the fixed points and convergence rates of an associative neural network in the presence of noise. We allow a random distribution of activity levels among the stored patterns and propose a modification of the standard Hebbian learning rule for this purpose. At a given noise level there is a threshold at which the retrieval ability, including the average final overlap and the convergence rate, is optimized for patterns with a particular activity level. This type of selective attention to one class of patterns with a certain activity level may be obtained at the cost of reduced retrieval ability for patterns with different activity levels. The effects of a constant threshold, independent of noise, time, and pattern, are also discussed. For high- (low-)activity patterns, a negative (positive) constant threshold increases the average final overlap at high noise levels and decreases it at low noise levels, whereas a positive (negative) threshold always reduces the average final overlap. When the magnitude of the constant threshold exceeds a critical value, there is no retrieval. Rates of convergence towards the stored pattern are greater with negative (positive) thresholds than with positive (negative) thresholds. These results are related to (de)sensitization and anesthesia.
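
The interplay of threshold, pattern activity, and noise described above can be illustrated with a small numerical sketch. The Python script below is not the paper's model; it is a minimal toy assuming 0/1 units, a covariance-style Hebbian rule that subtracts each pattern's mean activity, Glauber-type stochastic dynamics at a noise level T, and a uniform constant threshold theta. The function names (store_patterns, retrieve, overlap) and all parameter values are illustrative choices, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def store_patterns(N, activities):
    """Generate 0/1 patterns with given per-pattern activity levels and
    build a covariance-type Hebbian coupling matrix (an assumption here)."""
    patterns = np.array([rng.random(N) < a for a in activities], dtype=float)
    J = np.zeros((N, N))
    for xi, a in zip(patterns, activities):
        v = xi - a                         # subtract the pattern's mean activity
        J += np.outer(v, v)
    J /= N
    np.fill_diagonal(J, 0.0)               # no self-coupling
    return patterns, J

def overlap(S, xi, a):
    """Overlap of state S with pattern xi of activity a, normalized to 1
    for perfect retrieval."""
    return np.mean((xi - a) * (S - a)) / (a * (1.0 - a))

def retrieve(J, xi, a, theta, T, flip=0.1, steps=30):
    """Start from a corrupted copy of xi and run stochastic threshold
    dynamics; the returned overlap trajectory shows both the convergence
    rate and the final overlap."""
    N = len(xi)
    S = xi.copy()
    noisy = rng.random(N) < flip
    S[noisy] = 1.0 - S[noisy]              # flip a fraction of the bits
    traj = [overlap(S, xi, a)]
    for _ in range(steps):
        h = J @ S - theta                  # local field minus the threshold
        p_on = 1.0 / (1.0 + np.exp(-h / T))  # Glauber-type update at noise level T
        S = (rng.random(N) < p_on).astype(float)
        traj.append(overlap(S, xi, a))
    return traj

if __name__ == "__main__":
    N = 500
    activities = [0.2, 0.5, 0.8]           # low-, medium-, high-activity patterns
    patterns, J = store_patterns(N, activities)
    for theta in (-0.05, 0.0, 0.05):       # vary the constant threshold
        finals = [retrieve(J, xi, a, theta, T=0.05)[-1]
                  for xi, a in zip(patterns, activities)]
        print(f"theta={theta:+.2f}  final overlaps: "
              + ", ".join(f"{m:.2f}" for m in finals))
```

In this toy setting, sweeping theta at a fixed noise level reproduces the qualitative picture only in spirit: a positive threshold tends to favour retrieval of low-activity patterns at the expense of high-activity ones (and vice versa for a negative threshold), and a sufficiently large threshold magnitude suppresses retrieval altogether.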

Keywords: variable neuronal threshold; convergence rates; associative neural network; noise; Hebbian learning; retrieval ability; sensitization; anesthesia