Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation (Q6191372)
scientific article; zbMATH DE number 7802489
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation | scientific article; zbMATH DE number 7802489 | |
Statements
Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation (English)
9 February 2024
The paper studies the approximation of a function and its derivatives by a two-layer neural network. The activation function is chosen to be the rectified power unit \[ \text{ReLU}^k(x)=(\max(0,x))^k, \] so that the approximation of derivatives becomes possible. By investigating the corresponding Barron space, the authors show that two-layer networks with the \(\text{ReLU}^k\) activation function can simultaneously approximate an unknown function and its derivatives. A Tikhonov-type regularization method is proposed for this purpose, error bounds are established, and several numerical examples supporting the efficiency of the proposed approach are provided.
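The simultaneous approximation of a function and its derivative can be illustrated with a minimal numerical sketch. The network below is a generic two-layer \(\text{ReLU}^k\) model with randomly chosen weights; all parameter names and values are illustrative and not taken from the paper.

```python
import numpy as np

def relu_k(x, k):
    """Rectified power unit: (max(0, x))^k."""
    return np.maximum(0.0, x) ** k

def two_layer_net(x, a, w, b, k):
    """Two-layer network f(x) = sum_i a_i * ReLU^k(w_i * x + b_i), scalar input x."""
    return np.sum(a * relu_k(w * x + b, k))

def two_layer_net_derivative(x, a, w, b, k):
    """Exact derivative f'(x) = sum_i a_i * k * ReLU^{k-1}(w_i*x + b_i) * w_i.

    ReLU^k is (k-1)-times continuously differentiable, so for k >= 2 the
    derivative of the network is itself a continuous ReLU^{k-1} network.
    """
    z = w * x + b
    return np.sum(a * k * np.where(z > 0.0, z ** (k - 1), 0.0) * w)

# Illustrative example: a random ReLU^2 network; its exact derivative
# should agree with a central finite difference of f.
rng = np.random.default_rng(0)
n, k = 10, 2
a, w, b = rng.normal(size=n), rng.normal(size=n), rng.normal(size=n)
x, h = 0.3, 1e-6
fd = (two_layer_net(x + h, a, w, b, k) - two_layer_net(x - h, a, w, b, k)) / (2 * h)
exact = two_layer_net_derivative(x, a, w, b, k)
```

This is why the rectified power unit (rather than plain ReLU, whose derivative is discontinuous) is the natural choice when derivatives of the target function must also be approximated.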
ReLU neural networks
Tikhonov regularization
Barron spaces
derivative approximation