Exponential ReLU DNN expression of holomorphic maps in high dimension
DOI: 10.1007/s00365-021-09542-5 · zbMATH Open: 1500.41008 · OpenAlex: W3161792008 · Wikidata: Q115607850 (Scholia: Q115607850) · MaRDI QID: Q2117341 · FDO: Q2117341
Joost A. A. Opschoor, J. Zech, Christoph Schwab
Publication date: 21 March 2022
Published in: Constructive Approximation
Full work available at URL: https://doi.org/10.1007/s00365-021-09542-5
Recommendations
- Exponential ReLU neural network approximation rates for point and edge singularities
- Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in \(\pmb{L^2(\mathbb{R}^d,\gamma_d)}\)
- Exponential convergence of the deep neural network approximation for analytic functions
- Holomorphic feedforward networks
- Deep ReLU neural networks in high-dimensional approximation
- Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
- Nonlinear approximation and (deep) ReLU networks
- Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
- Deep learning in high dimension: neural network expression rates for generalized polynomial chaos expansions in UQ
- Exponential convergence for high-order recurrent neural networks with a class of general activation functions
MSC classification: Artificial neural networks and deep learning (68T07) · Multidimensional problems (41A63) · Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
Cites Work
- Optimal transport for applied mathematicians. Calculus of variations, PDEs, and modeling
- \(hp\)-dGFEM for second order elliptic problems in polyhedra. II: Exponential convergence
- [Title not available]
- [Title not available]
- [Title not available]
- Exponential convergence in \(H^1\) of \textit{hp}-FEM for Gevrey regularity with isotropic singularities
- [Title not available]
- Multilayer feedforward networks are universal approximators
- An Anisotropic Sparse Grid Stochastic Collocation Method for Partial Differential Equations with Random Input Data
- Analysis of quasi-optimal polynomial approximations for parameterized PDEs with deterministic and stochastic coefficients
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- Gevrey class regularity for the solutions of the Navier-Stokes equations
- [Title not available]
- Approximation by Ridge functions and neural networks with one hidden layer
- Approximation by superposition of sigmoidal and radial basis functions
- An upper estimate of integral points in real simplices with an application to singularity theory
- Exponential convergence of \(hp\)-FEM for Maxwell equations with weighted regularization in polygonal domains
- Breaking the curse of dimensionality in sparse polynomial approximation of parametric PDEs
- Multilevel approximation of parametric and stochastic PDES
- Approximation properties of a multilayered feedforward artificial neural network
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Error bounds for approximations with deep ReLU networks
- Deep learning in high dimension: Neural network expression rates for generalized polynomial chaos expansions in UQ
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Polynomial approximation of anisotropic analytic functions of several variables
- Exponential convergence of the deep neural network approximation for analytic functions
- Deep ReLU networks and high-order finite element methods
- Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units
- Convergence rates of high dimensional Smolyak quadrature
- Deep neural network expression of posterior expectations in Bayesian PDE inversion
Cited In (29)
- Deep ReLU network expression rates for option prices in high-dimensional, exponential Lévy models
- Solving parametric partial differential equations with deep rectified quadratic unit neural networks
- Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations
- Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
- Optimal approximation of infinite-dimensional holomorphic functions
- Computation and learning in high dimensions. Abstracts from the workshop held August 1--7, 2021 (hybrid meeting)
- Learning homogenization for elliptic operators
- Deep ReLU networks and high-order finite element methods. II: Chebyšev emulation
- Structure probing neural network deflation
- Higher-Order Quasi-Monte Carlo Training of Deep Neural Networks
- Approximation theory of tree tensor networks: tensorized univariate functions
- Neural and spectral operator surrogates: unified construction and expression rate bounds
- Convergence rate of DeepONets for learning operators arising from advection-diffusion equations
- De Rham compatible deep neural network FEM
- Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
- Approximation error for neural network operators by an averaged modulus of smoothness
- Deep solution operators for variational inequalities via proximal neural networks
- Physics Informed Neural Networks (PINNs) For Approximating Nonlinear Dispersive PDEs
- Parametric shape holomorphy of boundary integral operators with applications
- ReLU deep neural networks from the hierarchical basis perspective
- Neural network approximation
- Sparse approximation of triangular transports. I: The finite-dimensional case
- Exploiting locality in sparse polynomial approximation of parametric elliptic PDEs and application to parameterized domains
- Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations
- Exponential ReLU neural network approximation rates for point and edge singularities
- Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in \(\pmb{L^2(\mathbb{R}^d,\gamma_d)}\)
- Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
- Variational physics informed neural networks: the role of quadratures and test functions
- Neural network expression rates and applications of the deep parametric PDE method in counterparty credit risk