ReLU deep neural networks from the hierarchical basis perspective
DOI: 10.1016/j.camwa.2022.06.006 · OpenAlex: W3162222048 · MaRDI QID: Q2159911
Lin Li, Juncai He, Jin-Chao Xu
Publication date: 2 August 2022
Published in: Computers & Mathematics with Applications
Full work available at URL: https://arxiv.org/abs/2105.04156
MSC classification:
- Artificial neural networks and deep learning (68T07)
- Learning and adaptive systems in artificial intelligence (68T05)
- Finite element, Rayleigh-Ritz and Galerkin methods for boundary value problems involving PDEs (65N30)
- Rate of convergence, degree of approximation (41A25)
- Algorithms for approximation of functions (65D15)
Cites Work
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- DeepONet
- Multilayer feedforward networks are universal approximators
- Approximation rates for neural networks with general activation functions
- Exponential convergence of the deep neural network approximation for analytic functions
- Exponential ReLU DNN expression of holomorphic maps in high dimension
- Error bounds for approximations with deep ReLU networks
- ReLU Deep Neural Networks and Linear Finite Elements
- Universal approximation bounds for superpositions of a sigmoidal function
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions With $\ell^1$ and $\ell^0$ Controls
- New Error Bounds for Deep ReLU Networks Using Sparse Grids
- Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
- Deep ReLU networks and high-order finite element methods
- Error bounds for approximations with deep ReLU neural networks in $W^{s,p}$ norms
- DeepXDE: A Deep Learning Library for Solving Differential Equations
- Deep Network Approximation for Smooth Functions
- Finite Neuron Method and Convergence Analysis
- Sparse grids
- Approximation by superpositions of a sigmoidal function
- Exponential ReLU neural network approximation rates for point and edge singularities