On the stability properties of gated recurrent units neural networks (Q2059489)

From MaRDI portal
Author: Riccardo Scattolini
Reviewed by: P. V. Feketa
MaRDI profile type: MaRDI publication profile
OpenAlex ID: W3102807666
arXiv ID: 2011.06806
Cites works:
Multilayer feedforward networks are universal approximators
Neural networks for control systems - a survey
Recurrent neural network-based model predictive control for continuous pharmaceutical manufacturing
Global asymptotic stability and stabilization of long short-term memory neural networks with constant weights and biases
Input-to-state stability for discrete-time nonlinear systems
Moving-horizon state estimation for nonlinear discrete-time systems: new stability results and approximation schemes
Q5270493
A Lyapunov approach to incremental stability properties


scientific article

Language: English
Label: On the stability properties of gated recurrent units neural networks

    Statements

    On the stability properties of gated recurrent units neural networks (English)
    14 December 2021
    The paper provides sufficient conditions for the input-to-state stability (ISS) and the incremental input-to-state stability (\(\delta\)ISS) of gated recurrent unit (GRU) neural networks. Both the single-layer architecture \[ \begin{cases} x^+ = z \circ x + (1-z) \circ \phi(W_r u + U_r f \circ x + b_r)\\ z = \sigma(W_z u + U_z x + b_z)\\ f = \sigma(W_f u + U_f x + b_f)\\ y = U_0 x + b_0 \end{cases} \] and its multi-layer counterpart are considered. Here \(x\in\mathbb R^{n_x}\), \(n_x \in \mathbb N\), is the state vector, \(u\in \mathbb R^{n_u}\), \(n_u \in \mathbb N\), is the input vector, and \(y \in \mathbb R^{n_o}\), \(n_o \in \mathbb N\), is the output vector. The matrices \(W_*\) and \(U_*\) and the vectors \(b_*\) are the weights and biases that parametrize the model. The derived sufficient stability conditions take the form of nonlinear inequalities on the network's weights. They can be utilized in various ways, e.g., to verify the stability of a trained network, or they can be enforced as constraints during the training procedure of a GRU. The resulting training procedure is tested on a quadruple-tank nonlinear benchmark system.
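    For concreteness, the single-layer dynamics quoted above can be simulated with a short NumPy sketch. All dimensions, weights, and biases below are illustrative placeholders drawn at random, not values or conditions from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_x, n_u, n_o = 3, 2, 1  # state, input, output dimensions (illustrative)

sigma = lambda v: 1.0 / (1.0 + np.exp(-v))  # sigmoid gate activation
phi = np.tanh                               # squashing activation

# Placeholder weights W_*, U_* and biases b_* (random, not from the paper)
W_z, U_z, b_z = rng.standard_normal((n_x, n_u)), rng.standard_normal((n_x, n_x)), rng.standard_normal(n_x)
W_f, U_f, b_f = rng.standard_normal((n_x, n_u)), rng.standard_normal((n_x, n_x)), rng.standard_normal(n_x)
W_r, U_r, b_r = rng.standard_normal((n_x, n_u)), rng.standard_normal((n_x, n_x)), rng.standard_normal(n_x)
U_0, b_0 = rng.standard_normal((n_o, n_x)), rng.standard_normal(n_o)

def gru_step(x, u):
    """One step of the single-layer GRU dynamics stated in the review."""
    z = sigma(W_z @ u + U_z @ x + b_z)                             # update gate
    f = sigma(W_f @ u + U_f @ x + b_f)                             # forget gate
    x_next = z * x + (1.0 - z) * phi(W_r @ u + U_r @ (f * x) + b_r)
    y = U_0 @ x + b_0                                              # linear output map
    return x_next, y

# Simulate from the origin; since phi = tanh maps into (-1, 1) and the
# state update is a convex combination (z lies in (0, 1) componentwise),
# the state remains in (-1, 1)^{n_x} regardless of the input.
x = np.zeros(n_x)
for _ in range(50):
    x, y = gru_step(x, rng.standard_normal(n_u))
```

    This invariance of the unit box is a generic property of the GRU update and is weaker than the ISS/\(\delta\)ISS guarantees of the paper, which additionally require the derived weight inequalities to hold.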
    Keywords: neural networks; gated recurrent units; input-to-state stability; incremental input-to-state stability
