Dirac distributions and threshold firing in neural networks (Q807442)
From MaRDI portal
Revision as of 01:17, 5 March 2024
Language | Label | Description | Also known as
---|---|---|---
English | Dirac distributions and threshold firing in neural networks | scientific article |
Statements
Dirac distributions and threshold firing in neural networks (English)
1989
We extend our earlier work on positive reciprocal kernels of Fredholm integral operators [J. Math. Psychol. 31, 83-92 (1987; Zbl 0609.92038)] to study firings and their synthesis in neural networks. First we show that, in general, the neural response in both spontaneous and nonspontaneous firing gives rise to generalized functions of the Dirac type. For spontaneous, unstimulated firing we solve a homogeneous eigenvalue equation and obtain a family of gamma-type functions. Finite linear combinations of these functions are dense in Sobolev spaces, and the solution of the inhomogeneous equation representing nonspontaneous firing belongs to these spaces.

Next we show that, according to known facts about neural networks, the forcing function of the inhomogeneous equation is a linear combination of the above functions and can be used to represent the synthesis of stimuli within a neuron that cause it to fire. We also show that the solution of the inhomogeneous equation can be expressed as a linear combination of the basic functions describing the neural response to those stimuli. The need for a firing threshold, characteristic of the Dirac distribution, emerges as a necessary condition for the existence of a solution.

Second, we study the synthesis of the response of several neurons in both hierarchic and feedback network arrangements. The analysis is then briefly generalized to examine the response to several stimuli and to represent it as a direct sum of topological spaces. One observation is that generalized functions are appropriate representations of neural firings. Another is that understanding the structure of this representation is facilitated by the inevitable use of a fundamental set of dense functions to deal with the operations of a very complex system.
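The homogeneous eigenvalue equation mentioned in the abstract, λ w(x) = ∫ K(x, y) w(y) dy with a positive reciprocal kernel (K(x, y) K(y, x) = 1), can be approximated numerically by a Nyström discretization. The sketch below is illustrative only and not the paper's construction: the kernel, interval, and the gamma-type function g(t) = t e^{-t} are assumptions chosen so the answer is known in closed form — for a consistent kernel K(x, y) = g(x)/g(y), the principal eigenfunction is g itself and the eigenvalue equals the interval length b − a.

```python
import numpy as np

# Nyström discretization of the homogeneous Fredholm eigenvalue equation
#   lambda * w(x) = integral_a^b K(x, y) w(y) dy
# on [a, b]. The kernel below is an ILLUSTRATIVE consistent reciprocal
# kernel K(x, y) = g(x)/g(y) (so K(x, y) * K(y, x) = 1), not the paper's.

a, b, n = 0.1, 5.0, 400
x, h = np.linspace(a, b, n, retstep=True)

def g(t):
    return t * np.exp(-t)            # a gamma-type function, t^1 e^{-t}

K = g(x)[:, None] / g(x)[None, :]    # consistent positive reciprocal kernel

wts = np.full(n, h)                  # trapezoidal quadrature weights
wts[0] = wts[-1] = h / 2
A = K * wts[None, :]                 # matrix approximating the integral operator

vals, vecs = np.linalg.eig(A)
i = np.argmax(vals.real)             # principal eigenvalue
lam = vals[i].real
w = vecs[:, i].real
w = w / w[np.argmax(np.abs(w))]      # normalize so the peak value is +1

print(lam)                           # close to b - a for this consistent kernel
```

Because the consistent kernel factors as a rank-one operator, the discretized matrix has a single nonzero eigenvalue Σ_j wts_j = b − a, and the recovered eigenvector reproduces the gamma-type function g up to scaling; for a non-consistent reciprocal kernel the same discretization applies but no closed-form check is available.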
Keywords: membrane potential; positive reciprocal kernels of Fredholm integral operators; neural networks; homogeneous eigenvalue equation; gamma-type functions; Sobolev spaces; firing threshold; Dirac distribution; neural firings