Convergence and stability of the split-step \(\theta\)-Milstein method for stochastic delay Hopfield neural networks (Q369736)
Property / full work available at URL: https://doi.org/10.1155/2013/169214
Property / OpenAlex ID: W2082383156
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Convergence and stability of the split-step \(\theta\)-Milstein method for stochastic delay Hopfield neural networks | scientific article | |
Statements
Convergence and stability of the split-step \(\theta\)-Milstein method for stochastic delay Hopfield neural networks (English)
19 September 2013
Summary: A new splitting method designed for the numerical solution of stochastic delay Hopfield neural networks is introduced and analysed. Under Lipschitz and linear growth conditions, this split-step \(\theta\)-Milstein method is proved to have strong convergence of order 1 in the mean-square sense, which is higher than that of the existing split-step \(\theta\)-method. Furthermore, the mean-square stability of the proposed method is investigated. Numerical experiments and comparisons with existing methods illustrate the computational efficiency of our method.
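As a rough illustration of the kind of scheme summarized above, the following is a minimal sketch of one common split-step \(\theta\)-Milstein formulation, applied to a scalar linear test stochastic delay differential equation \(dx(t) = [a\,x(t) + b\,x(t-\tau)]\,dt + c\,x(t)\,dW(t)\) with constant initial history. It is not the article's reference implementation: the function name, parameter values, and the exact form of the implicit split step are illustrative assumptions, and the scheme analysed in the paper treats the full stochastic delay Hopfield network system and may differ in details.

```python
# Sketch only: one common split-step theta-Milstein scheme for the scalar
# linear test SDDE  dx = [a x(t) + b x(t - tau)] dt + c x(t) dW(t),
# with constant history x(t) = x0 for t <= 0.  Parameters are illustrative
# assumptions, not values from the cited article.

import numpy as np

def split_step_theta_milstein(a, b, c, x0, tau, T, h, theta, rng):
    """One sample path on [0, T] with step size h (h should divide tau)."""
    m = int(round(tau / h))          # number of steps spanning one delay
    n_steps = int(round(T / h))
    x = np.empty(n_steps + 1)
    x[0] = x0

    def delayed(n):
        # constant initial history for t <= 0
        return x[n - m] if n - m >= 0 else x0

    for n in range(n_steps):
        x_del = delayed(n)
        # implicit split step  y = x_n + theta*h*f(y, x_del); solvable in
        # closed form here because the drift is linear in the current state
        y = (x[n] + theta * h * b * x_del) / (1.0 - theta * h * a)
        dW = rng.normal(0.0, np.sqrt(h))
        drift = a * y + b * x_del
        diff = c * y
        # Milstein correction 0.5 * g * (dg/dx) * (dW^2 - h), with dg/dx = c
        x[n + 1] = (x[n] + h * drift + diff * dW
                    + 0.5 * c * diff * (dW * dW - h))
    return x

rng = np.random.default_rng(0)
path = split_step_theta_milstein(a=-2.0, b=0.5, c=0.3, x0=1.0,
                                 tau=1.0, T=5.0, h=0.01,
                                 theta=0.5, rng=rng)
print(path[-1])
```

The implicit split step is written in closed form only because the test drift is linear in the current state; for the nonlinear activation functions of a Hopfield network, that step would instead require a fixed-point or Newton iteration.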