Representation formulas and pointwise properties for Barron functions
From MaRDI portal
MSC classification:
- Artificial neural networks and deep learning (68T07)
- Approximation by other special function classes (41A30)
- Banach spaces of continuous, differentiable or analytic functions (46E15)
- Special properties of functions of several variables, Hölder conditions, etc. (26B35)
- Representation and superposition of functions (26B40)
Abstract: We study the natural function space for infinitely wide two-layer neural networks with ReLU activation (Barron space) and establish different representation formulas. In two cases, we describe the space explicitly up to isomorphism. Using a convenient representation, we study the pointwise properties of two-layer networks and show that functions whose singular set is fractal or curved (for example, distance functions from smooth submanifolds) cannot be represented by infinitely wide two-layer networks with finite path-norm. We use this structure theorem to show that the only \(C^1\)-diffeomorphisms which preserve Barron space are affine maps. Furthermore, we show that every Barron function can be decomposed as the sum of a bounded and a positively one-homogeneous function, and that there exist Barron functions which decay rapidly at infinity and are globally Lebesgue-integrable. This result suggests that two-layer neural networks may be able to approximate a greater variety of functions than commonly believed.
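As background for the abstract (a sketch using the standard definitions of Barron space, which are not restated on this page): a Barron function can be written as an expectation \(f(x) = \mathbb{E}_{(a,b,c)\sim\mu}\, a\,\sigma(b\cdot x + c)\) over ReLU neurons, and its path-norm is the infimum of \(\mathbb{E}_\mu\, |a|(\|b\|_1 + |c|)\) over all representing measures \(\mu\). The finite-width analogue below illustrates both quantities; the function names and the Monte Carlo setup are illustrative, not from the paper.

```python
# Finite-width analogue of a Barron function: a two-layer ReLU network
# f(x) = (1/n) * sum_i a_i * relu(b_i . x + c_i),
# together with its empirical path-norm
# (1/n) * sum_i |a_i| * (||b_i||_1 + |c_i|),
# which upper-bounds the Barron norm of f.
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def two_layer(x, a, B, c):
    """Evaluate f(x) = mean_i a_i * relu(b_i . x + c_i)."""
    return np.mean(a * relu(B @ x + c))

def path_norm(a, B, c):
    """Empirical path-norm: mean_i |a_i| * (||b_i||_1 + |c_i|)."""
    return np.mean(np.abs(a) * (np.abs(B).sum(axis=1) + np.abs(c)))

if __name__ == "__main__":
    # Random network as a Monte Carlo sample from a representing measure.
    rng = np.random.default_rng(0)
    n, d = 1000, 3
    a = rng.normal(size=n)          # outer weights a_i
    B = rng.normal(size=(n, d))     # inner weights b_i (rows)
    c = rng.normal(size=n)          # biases c_i

    x = np.ones(d)
    print(two_layer(x, a, B, c), path_norm(a, B, c))
```

As the width \(n \to \infty\), such averages converge to the integral representation studied in the paper, and networks with uniformly bounded path-norm stay in a bounded ball of Barron space.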
Recommendations
- A New Function Space from Barron Class and Application to Neural Network Approximation
- Understanding neural networks with reproducing kernel Banach spaces
- Provable approximation properties for deep neural networks
- Banach space representer theorems for neural networks and ridge splines
- A global universality of two-layer neural networks with ReLU activations
Cites work
- scientific article; zbMATH DE number 5263198
- A mean field view of the landscape of two-layer neural networks
- A priori estimates of the population risk for two-layer neural networks
- Applied functional analysis. Functional analysis, Sobolev spaces and elliptic differential equations
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions With \(\ell^1\) and \(\ell^0\) Controls
- Approximation by superpositions of a sigmoidal function
- Breaking the curse of dimensionality with convex neural networks
- Functional analysis, Sobolev spaces and partial differential equations
- Hinging hyperplanes for regression, classification, and function approximation
- Kolmogorov width decay and poor approximators in machine learning: shallow neural networks, random feature models and neural tangent kernels
- Machine learning from a continuous viewpoint. I
- Mean field analysis of neural networks: a law of large numbers
- Measure theory and fine properties of functions
- The Variational Formulation of the Fokker--Planck Equation
- Universal approximation bounds for superpositions of a sigmoidal function
- Variational Analysis in Sobolev and BV Spaces
- Wahrscheinlichkeitstheorie
Cited in (20)
- Applied harmonic analysis and data science. Abstracts from the workshop held April 21--26, 2024
- Nonlinear weighted directed acyclic graph and a priori estimates for neural networks
- Numerical solution of Poisson partial differential equation in high dimension using two-layer neural networks
- A New Function Space from Barron Class and Application to Neural Network Approximation
- Uniform approximation rates and metric entropy of shallow neural networks
- Embeddings between Barron spaces with higher-order activation functions
- The Barron space and the flow-induced function spaces for neural network models
- Numerical analysis of physics-informed neural networks and related models in physics-informed machine learning
- Deep Ritz method for the spectral fractional Laplacian equation using the Caffarelli-Silvestre extension
- Neural network approximation and estimation of classifiers with classification boundary in a Barron class
- Piecewise linear functions representable with infinite width shallow ReLU neural networks
- Low dimensional approximation and generalization of multivariate functions on smooth manifolds using deep ReLU neural networks
- Solving the regularized Schamel equation by the singular planar dynamical system method and the deep learning method
- A convergent deep learning algorithm for approximation of polynomials
- Greedy training algorithms for neural networks and applications to PDEs
- Understanding neural networks with reproducing kernel Banach spaces
- Deep Neural Networks for Solving Large Linear Systems Arising from High-Dimensional Problems
- Two-Layer Neural Networks with Values in a Banach Space
- A priori generalization error analysis of two-layer neural networks for solving high dimensional Schrödinger eigenvalue problems
- Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation
This page was built for publication: Representation formulas and pointwise properties for Barron functions
MaRDI item Q2113295