Stability of discrete time recurrent neural networks and nonlinear optimization problems

From MaRDI portal
Publication: 1667540

DOI: 10.1016/J.NEUNET.2015.10.013
zbMATH Open: 1395.39008
arXiv: 1503.01818
OpenAlex: W1631735240
Wikidata: Q50758046 (Scholia: Q50758046)
MaRDI QID: Q1667540
FDO: Q1667540

Jayant Singh, Nikita E. Barabanov

Publication date: 30 August 2018

Published in: Neural Networks

Abstract: We consider the method of Reduction of Dissipativity Domain to prove global Lyapunov stability of discrete-time recurrent neural networks. The standard and advanced criteria for absolute stability of these essentially nonlinear systems produce rather weak results. The method mentioned above is proved to be more powerful. It involves a multi-step procedure with maximization of special nonconvex functions over polytopes at every step. We derive conditions which guarantee the existence of at most one point of local maximum of such a function over every hyperplane. This nontrivial result is valid for a wide range of neuron transfer functions.
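
The paper concerns global Lyapunov stability of discrete-time recurrent networks of the form x_{k+1} = f(W x_k). As a minimal illustrative sketch (not the authors' Reduction of Dissipativity Domain procedure), the following assumed Python snippet iterates such a network with a tanh transfer function and empirically probes whether random trajectories contract to the origin; the function names and tolerances are hypothetical choices for illustration only.

```python
import numpy as np

def tanh_dtrnn_step(W, x):
    # One step of a discrete-time recurrent network: x_{k+1} = tanh(W x_k).
    return np.tanh(W @ x)

def empirical_stability_probe(W, n_trials=100, n_steps=500, tol=1e-6, seed=0):
    """Crude numerical probe: iterate many random initial states and check
    whether every trajectory contracts toward the origin. This is only a
    sanity check, not the paper's dissipativity-domain reduction method."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    for _ in range(n_trials):
        x = rng.uniform(-10.0, 10.0, size=n)
        for _ in range(n_steps):
            x = tanh_dtrnn_step(W, x)
        if np.linalg.norm(x) > tol:
            return False
    return True

# Since tanh is 1-Lipschitz, any W with spectral norm below 1 gives a
# globally contracting system, so the probe should report stability.
W_stable = 0.5 * np.array([[0.4, -0.3], [0.2, 0.5]])
print(empirical_stability_probe(W_stable))  # True for this contractive example
```

Such a probe can only provide counterexamples or numerical evidence; the point of the paper is to certify stability analytically in the harder cases where simple norm bounds and standard absolute-stability criteria fail.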


Full work available at URL: https://arxiv.org/abs/1503.01818





Cited In (11)






