A New Function Space from Barron Class and Application to Neural Network Approximation
From MaRDI portal
Publication:5878925
DOI: 10.4208/cicp.OA-2022-0151 · zbMath: 1505.65294 · MaRDI QID: Q5878925
Publication date: 23 February 2023
Published in: Communications in Computational Physics
Mathematics Subject Classification:
- Artificial neural networks and deep learning (68T07)
- Boundary value problems for second-order elliptic equations (35J25)
- Finite element, Rayleigh-Ritz and Galerkin methods for boundary value problems involving PDEs (65N30)
- Stability and convergence of numerical methods for initial value and initial-boundary value problems involving PDEs (65M12)
- Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
Cites Work
- Growth and integrability of Fourier transforms on Euclidean space
- Global well-posedness for Keller-Segel system in Besov type spaces
- Theory of function spaces
- Symétrie et compacité dans les espaces de Sobolev [Symmetry and compactness in Sobolev spaces]
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- Approximation by superposition of sigmoidal and radial basis functions
- Existence of solitary waves in higher dimensions
- Uniform approximation by neural networks
- Generalization bounds for function approximation from scattered noisy data
- Rates of convex approximation in non-Hilbert spaces
- On Gagliardo-Nirenberg type inequalities in Fourier-Herz spaces
- Provable approximation properties for deep neural networks
- Approximation rates for neural networks with general activation functions
- Representation formulas and pointwise properties for Barron functions
- Approximation spaces of deep neural networks
- The Barron space and the flow-induced function spaces for neural network models
- High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions
- On dispersive effect of the Coriolis force for the stationary Navier-Stokes equations
- Optimal constants in the Marcinkiewicz-Zygmund inequalities
- Approximation and learning by greedy algorithms
- Growth properties of the Fourier transform
- Cube Slicing in \(\mathbb{R}^n\)
- The best constants in the Khintchine inequality
- Approximation by Ridge Functions and Neural Networks
- Universal approximation bounds for superpositions of a sigmoidal function
- Hinging hyperplanes for regression, classification, and function approximation
- Degree of Approximation Results for Feedforward Networks Approximating Unknown Mappings and Their Derivatives
- Estimates of Fourier transforms in Sobolev spaces
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with \(\ell^1\) and \(\ell^0\) Controls
- Convergence Rate Analysis for Deep Ritz Method
- A Rate of Convergence of Physics Informed Neural Networks for the Linear Second Order Elliptic PDEs
- Deep Network Approximation Characterized by Number of Neurons
- Deep Nitsche Method: Deep Ritz Method with Essential Boundary Conditions
- Global mild solutions of Navier-Stokes equations
- Upper bound on \(\sqrt{x}\, J_\nu(x)\) and its applications
- On the Theory of Dynamic Programming
- Neural network approximation
- Ridge functions and orthonormal ridgelets