On the stability properties of gated recurrent units neural networks (Q2059489)
Language | Label | Description | Also known as |
---|---|---|---|
English | On the stability properties of gated recurrent units neural networks | scientific article | |
Statements
On the stability properties of gated recurrent units neural networks (English)
Publication date: 14 December 2021
The paper provides sufficient conditions for input-to-state stability (ISS) and incremental input-to-state stability (\(\delta\)ISS) of gated recurrent unit (GRU) neural networks. Both the single-layer architecture \[ \begin{cases} x^+ = z \circ x + (1-z) \circ \phi(W_r u + U_r (f \circ x) + b_r)\\ z = \sigma(W_z u + U_z x + b_z)\\ f = \sigma(W_f u + U_f x + b_f)\\ y = U_o x + b_o \end{cases} \] and its multi-layer counterpart are considered. Here \(x\in\mathbb R^{n_x}\) is the state vector, \(u\in \mathbb R^{n_u}\) is the input vector, and \(y \in \mathbb R^{n_o}\) is the output vector, with \(n_x, n_u, n_o \in \mathbb N\); \(\circ\) denotes the Hadamard (elementwise) product, \(\sigma\) the logistic sigmoid, and \(\phi\) the hyperbolic tangent. The matrices \(W_*\) and \(U_*\) and the vectors \(b_*\) are the weights and biases that parametrize the model. The derived sufficient stability conditions consist of nonlinear inequalities on the network's weights. They can be used in several ways, e.g., to certify the stability of an already trained network, or they can be enforced as constraints during the training of a GRU. The resulting training procedure is tested on the quadruple-tank nonlinear benchmark system.
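To make the state-space form above concrete, here is a minimal NumPy sketch of one step of the single-layer GRU recursion, assuming (as is standard for GRUs) that \(\sigma\) is the logistic sigmoid and \(\phi = \tanh\); the function and variable names are illustrative, not taken from the paper:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def gru_step(x, u, W_z, U_z, b_z, W_f, U_f, b_f, W_r, U_r, b_r):
    """One step of the single-layer GRU state equation reviewed above.

    x : current state, shape (n_x,); u : current input, shape (n_u,).
    Returns the successor state x^+.
    """
    z = sigmoid(W_z @ u + U_z @ x + b_z)   # update gate
    f = sigmoid(W_f @ u + U_f @ x + b_f)   # forget gate
    # Convex combination of old state and candidate state:
    return z * x + (1.0 - z) * np.tanh(W_r @ u + U_r @ (f * x) + b_r)

# Illustrative usage with random weights and dimensions (hypothetical values):
rng = np.random.default_rng(0)
n_x, n_u = 4, 2
shapes = {"W_z": (n_x, n_u), "U_z": (n_x, n_x), "b_z": (n_x,),
          "W_f": (n_x, n_u), "U_f": (n_x, n_x), "b_f": (n_x,),
          "W_r": (n_x, n_u), "U_r": (n_x, n_x), "b_r": (n_x,)}
params = {name: rng.standard_normal(shape) for name, shape in shapes.items()}
x = np.zeros(n_x)
for _ in range(10):
    x = gru_step(x, rng.standard_normal(n_u), **params)
```

In this notation, the paper's sufficient conditions are nonlinear inequalities on the entries of \(W_*\) and \(U_*\) that guarantee ISS or \(\delta\)ISS of the recursion; they can be checked on `params` after training or imposed as constraints while fitting them.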
neural networks
gated recurrent units
input-to-state stability
incremental input-to-state stability