Exponential convergence of the deep neural network approximation for analytic functions
Abstract: We prove that for analytic functions in low dimension, the convergence rate of the deep neural network approximation is exponential.
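For orientation, approximation results of this type are usually stated as an error bound that decays exponentially in the network size. The display below is an illustrative sketch of that generic form only; the precise norm, network class, exponent, and constants are assumptions and are not quoted from the paper.

\[
\inf_{\phi \in \mathcal{N}_N} \, \| f - \phi \|_{L^{\infty}([-1,1]^d)} \;\le\; C(f,d)\, e^{-\gamma N^{1/d}},
\]

where \( \mathcal{N}_N \) denotes deep networks with at most \( N \) nonzero weights, \( f \) is analytic on a neighborhood of \( [-1,1]^d \), and \( \gamma > 0 \) depends on the region of analyticity. The key contrast is with classical rates for functions of finite smoothness, which decay only algebraically in \( N \).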
Cited in (42)

- Full error analysis for the training of deep neural networks
- Constructive deep ReLU neural network approximation
- Exponential ReLU DNN expression of holomorphic maps in high dimension
- Deep Neural Networks for Solving Large Linear Systems Arising from High-Dimensional Problems
- PDE-Net 2.0: learning PDEs from data with a numeric-symbolic hybrid deep network
- N-adaptive Ritz method: a neural network enriched partition of unity for boundary value problems
- MgNet: a unified framework of multigrid and convolutional neural network
- Rectified deep neural networks overcome the curse of dimensionality for nonsmooth value functions in zero-sum games of nonlinear stiff systems
- A note on the applications of one primary function in deep neural networks
- Optimization of random feature method in the high-precision regime
- Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
- Computation and learning in high dimensions. Abstracts from the workshop held August 1--7, 2021 (hybrid meeting)
- Structure probing neural network deflation
- Deep Network Approximation for Smooth Functions
- Selection dynamics for deep neural networks
- Analytic continuation of noisy data using Adams Bashforth residual neural network
- SelectNet: self-paced learning for high-dimensional partial differential equations
- On the approximation of functions by tanh neural networks
- Deep network with approximation error being reciprocal of width to power of square root of depth
- Deep ReLU neural networks in high-dimensional approximation
- Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
- Improved training of physics-informed neural networks for parabolic differential equations with sharply perturbed initial conditions
- Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
- Convergence of deep convolutional neural networks
- Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
- Nonlinear approximation and (deep) ReLU networks
- ReLU deep neural networks from the hierarchical basis perspective
- Nonlinear approximation via compositions
- On mathematical modeling in image reconstruction and beyond
- Better approximations of high dimensional smooth functions by deep neural networks with rectified power units
- Neural network approximation
- Sparse approximation of triangular transports. I: The finite-dimensional case
- Computing ground states of Bose-Einstein condensation by normalized deep neural network
- The gap between theory and practice in function approximation with deep neural networks
- Optimal approximation rate of ReLU networks in terms of width and depth
- Solving PDEs on unknown manifolds with machine learning
- Simultaneous neural network approximation for smooth functions
- Non-homogeneous Poisson process intensity modeling and estimation using measure transport
- Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
- Deep ReLU networks and high-order finite element methods
- Neural network expression rates and applications of the deep parametric PDE method in counterparty credit risk
- Response to stress via underlying deep gene regulation networks
MaRDI item: Q1989902