Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation (Q6191372)

Property / full work available at URL: https://doi.org/10.1007/s00211-023-01384-6
Property / OpenAlex ID: W4388928889

Language: English
Label: Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation
Description: scientific article; zbMATH DE number 7802489

    Statements

    Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation (English)
    Publication date: 9 February 2024
    The paper studies the approximation of a function and its derivatives by a two-layer neural network. The activation function is chosen to be the rectified power unit \[ \text{ReLU}^k(x)=(\max(0,x))^k, \] so that approximation of derivatives becomes possible. By investigating the corresponding Barron space, the authors show that two-layer networks with the \(\text{ReLU}^k\) activation function can simultaneously approximate an unknown function and its derivatives. A Tikhonov-type regularization method is proposed for this purpose. Error bounds are established, and several numerical examples are provided to demonstrate the efficiency of the proposed approach.
    ReLU neural networks
    Tikhonov regularization
    Barron spaces
    derivative approximation
