Convergence and stability of the split-step \(\theta\)-Milstein method for stochastic delay Hopfield neural networks (Q369736)

From MaRDI portal
Mathematics Subject Classification IDs: 92B20; 60H10; 92-08
zbMATH DE Number: 6209190


scientific article

Language: English
Label: Convergence and stability of the split-step \(\theta\)-Milstein method for stochastic delay Hopfield neural networks
Description: scientific article

    Statements

    Convergence and stability of the split-step \(\theta\)-Milstein method for stochastic delay Hopfield neural networks (English)
    Published: 19 September 2013
    Summary: A new splitting method for the numerical solution of stochastic delay Hopfield neural networks is introduced and analysed. Under Lipschitz and linear growth conditions, this split-step \(\theta\)-Milstein method is proved to have strong convergence of order 1 in the mean-square sense, which is higher than the order of the existing split-step \(\theta\)-method. Furthermore, the mean-square stability of the proposed method is investigated. Numerical experiments and comparisons with existing methods illustrate the computational efficiency of our method.
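
For orientation, the following is a minimal sketch of one common split-step \(\theta\)-Milstein variant for a scalar stochastic delay differential equation \(dX(t) = f(X(t), X(t-\tau))\,dt + g(X(t), X(t-\tau))\,dW(t)\): a \(\theta\)-implicit drift stage followed by a Milstein-corrected stochastic stage. The concrete drift, diffusion, parameter values, and fixed-point solver below are illustrative assumptions, not the exact scheme analysed in the paper.

# A minimal sketch of a split-step theta-Milstein step for a scalar stochastic
# delay differential equation with constant delay tau and constant initial
# history on [-tau, 0]. Drift, diffusion and parameters are illustrative only.
import numpy as np

def split_step_theta_milstein(f, g, dg_dx, history, tau, T, h, theta, rng):
    """Simulate one path on [0, T].

    f, g   : drift and diffusion, called as f(x, x_delayed), g(x, x_delayed)
    dg_dx  : partial derivative of g in its first (non-delayed) argument,
             used in the Milstein correction; if g also depends on the delayed
             state, further correction terms would be required.
    theta  : implicitness parameter in [0, 1]
    """
    m = int(round(tau / h))              # steps spanning one delay interval
    n = int(round(T / h))                # steps on [0, T]
    X = np.empty(n + m + 1)
    X[:m + 1] = history                  # constant segment on [-tau, 0]

    for k in range(m, n + m):
        x, xd = X[k], X[k - m]           # current and delayed states
        # Stage 1 (deterministic, theta-implicit):
        #   y = x + h * ((1 - theta) * f(x, xd) + theta * f(y, xd)),
        # solved by fixed-point iteration (adequate for small h, Lipschitz f).
        y = x
        for _ in range(50):
            y_next = x + h * ((1 - theta) * f(x, xd) + theta * f(y, xd))
            if abs(y_next - y) < 1e-12:
                y = y_next
                break
            y = y_next
        # Stage 2 (stochastic, with Milstein correction at the stage value).
        dW = np.sqrt(h) * rng.standard_normal()
        X[k + 1] = y + g(y, xd) * dW + 0.5 * g(y, xd) * dg_dx(y, xd) * (dW * dW - h)
    return X[m:]                         # values at t = 0, h, 2h, ..., T

# Hypothetical scalar "Hopfield-type" test problem: linear decay, delayed
# tanh activation, multiplicative noise on the current state.
rng = np.random.default_rng(0)
f = lambda x, xd: -2.0 * x + np.tanh(xd)
g = lambda x, xd: 0.3 * x
dg = lambda x, xd: 0.3
path = split_step_theta_milstein(f, g, dg, history=1.0, tau=1.0,
                                 T=5.0, h=0.01, theta=0.5, rng=rng)
print(path[-1])

With a Lipschitz drift and a small step size the fixed-point iteration in the drift stage converges; for stiff problems one would typically swap it for a Newton solve.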
