Convergence and stability of the split-step \(\theta\)-Milstein method for stochastic delay Hopfield neural networks (Q369736)
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Convergence and stability of the split-step \(\theta\)-Milstein method for stochastic delay Hopfield neural networks | scientific article | |
Statements
Convergence and stability of the split-step \(\theta\)-Milstein method for stochastic delay Hopfield neural networks (English)
19 September 2013
Summary: A new splitting method designed for the numerical solution of stochastic delay Hopfield neural networks is introduced and analysed. Under Lipschitz and linear growth conditions, this split-step \(\theta\)-Milstein method is proved to be strongly convergent of order 1 in the mean-square sense, which is higher than the order of the existing split-step \(\theta\)-method. Furthermore, the mean-square stability of the proposed method is investigated. Numerical experiments and comparisons with existing methods illustrate the computational efficiency of our method.
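The abstract does not state the scheme explicitly; the following is a minimal sketch, assuming one common form of the split-step \(\theta\)-Milstein scheme for a scalar stochastic delay differential equation \(\mathrm{d}x(t) = f(x(t), x(t-\tau))\,\mathrm{d}t + g(x(t), x(t-\tau))\,\mathrm{d}W(t)\). All names (`f`, `g`, `dg_dx`, `theta`, `tau`) and the fixed-point treatment of the implicit stage are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def split_step_theta_milstein(f, g, dg_dx, x0_hist, tau, T, h, theta=0.5, seed=0):
    """Sketch of a split-step theta-Milstein scheme for a scalar SDDE.

    f, g    : drift and diffusion, each taking (current value, delayed value).
    dg_dx   : partial derivative of g with respect to its first argument
              (used in the Milstein correction term).
    x0_hist : callable giving the initial segment x(t) for t in [-tau, 0].
    """
    rng = np.random.default_rng(seed)
    m = int(round(tau / h))                  # delay measured in steps
    N = int(round(T / h))                    # number of time steps
    x = np.empty(N + m + 1)                  # initial segment followed by the solution
    x[:m + 1] = [x0_hist(-tau + i * h) for i in range(m + 1)]

    for n in range(N):
        xn, xd = x[n + m], x[n]              # x(t_n) and x(t_n - tau)
        # implicit split step y = xn + theta*h*f(y, xd), solved by fixed-point iteration
        y = xn
        for _ in range(50):
            y_new = xn + theta * h * f(y, xd)
            if abs(y_new - y) < 1e-12:
                break
            y = y_new
        dW = rng.normal(0.0, np.sqrt(h))     # Brownian increment over one step
        gy = g(y, xd)
        # Milstein update built on the intermediate value y
        x[n + m + 1] = (y + (1.0 - theta) * h * f(y, xd)
                        + gy * dW
                        + 0.5 * gy * dg_dx(y, xd) * (dW ** 2 - h))
    return x[m:]                             # approximate solution on [0, T]
```

With \(\theta = 0\) the intermediate stage is explicit and the update reduces to a split-step Milstein step; for \(\theta > 0\) the drift is treated partially implicitly, which is typically what improves mean-square stability for \(\theta\)-type schemes in the stochastic delay setting.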