Deep learning in high dimension: neural network expression rates for generalized polynomial chaos expansions in UQ
Publication: 4615657
DOI: 10.1142/S0219530518500203 · zbMATH Open: 1478.68309 · OpenAlex: W2883486956 · Wikidata: Q129516912 · Scholia: Q129516912 · MaRDI QID: Q4615657 · FDO: Q4615657
Publication date: 29 January 2019
Published in: Analysis and Applications
Full work available at URL: https://doi.org/10.1142/s0219530518500203
Recommendations
- Deep learning in high dimension: ReLU neural network expression for Bayesian PDE inversion
- Exponential ReLU DNN expression of holomorphic maps in high dimension
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- Deep neural network expression of posterior expectations in Bayesian PDE inversion
- Proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients
Cites Work
- Universal approximation bounds for superpositions of a sigmoidal function
- Inverse problems: a Bayesian perspective
- Analytic regularity and polynomial approximation of parametric and stochastic elliptic PDE's
- High-dimensional adaptive sparse polynomial interpolation and applications to parametric PDEs
- Multilayer feedforward networks are universal approximators
- Title not available
- Convergence rates of best \(N\)-term Galerkin approximations for a class of elliptic SPDEs
- Title not available
- Convergence rates of multilevel and sparse tensor approximations for a random elliptic PDE
- Approximation of high-dimensional parametric PDEs
- Analyticity in infinite dimensional spaces
- Sparse deterministic approximation of Bayesian inverse problems
- Numerical solution of parametrized Navier–Stokes equations by reduced basis methods
- Breaking the curse of dimensionality in sparse polynomial approximation of parametric PDEs
- Sparse polynomial approximation of parametric elliptic PDEs. Part I: affine coefficients
- Title not available
- Multilevel approximation of parametric and stochastic PDEs
- Fully Discrete Approximation of Parametric and Stochastic Elliptic PDEs
- Electromagnetic wave scattering by random surfaces: Shape holomorphy
- Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations
- Error bounds for approximations with deep ReLU networks
- Differential operators on domains with conical points: precise uniform regularity estimates
- Deep UQ: learning deep neural network surrogate models for high dimensional uncertainty quantification
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Multilevel higher-order quasi-Monte Carlo Bayesian estimation
- Shape Holomorphy of the Stationary Navier–Stokes Equations
- Deep vs. shallow networks: an approximation theory perspective
- Convergence rates of high dimensional Smolyak quadrature
Cited In (75)
- Deep ReLU network expression rates for option prices in high-dimensional, exponential Lévy models
- Solving high-dimensional Hamilton-Jacobi-Bellman PDEs using neural networks: perspectives from the theory of controlled diffusions and measures on path space
- Title not available
- CAS4DL: Christoffel adaptive sampling for function approximation via deep learning
- Constructive deep ReLU neural network approximation
- Approximation rates for neural networks with encodable weights in smoothness spaces
- Solving parametric partial differential equations with deep rectified quadratic unit neural networks
- DNN expression rate analysis of high-dimensional PDEs: application to option pricing
- Exponential ReLU DNN expression of holomorphic maps in high dimension
- Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations
- Non-Intrusive Reduced Order Modeling of Convection Dominated Flows Using Artificial Neural Networks with Application to Rayleigh-Taylor Instability
- Robust randomized optimization with k nearest neighbors
- Rectified deep neural networks overcome the curse of dimensionality for nonsmooth value functions in zero-sum games of nonlinear stiff systems
- Deep Neural Network Surrogates for Nonsmooth Quantities of Interest in Shape Uncertainty Quantification
- Domain Uncertainty Quantification in Computational Electromagnetics
- Sparse polynomial approximations for affine parametric saddle point problems
- The Random Feature Model for Input-Output Maps between Banach Spaces
- On the approximation of functions by tanh neural networks
- Deep ReLU neural networks in high-dimensional approximation
- Balanced joint maximum mean discrepancy for deep transfer learning
- The Gap between Theory and Practice in Function Approximation with Deep Neural Networks
- Numerical solution of the parametric diffusion equation by deep neural networks
- Model reduction and neural networks for parametric PDEs
- Deep neural network expression of posterior expectations in Bayesian PDE inversion
- Stein variational gradient descent with local approximations
- Efficient approximation of solutions of parametric linear transport equations by ReLU DNNs
- ReLU neural network Galerkin BEM
- Convergence rate of DeepONets for learning operators arising from advection-diffusion equations
- Surrogate modeling for Bayesian inverse problems based on physics-informed neural networks
- Neural network approximation and estimation of classifiers with classification boundary in a Barron class
- De Rham compatible deep neural network FEM
- Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
- Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
- Multilevel approximation of parametric and stochastic PDEs
- Probabilistic partition of unity networks for high-dimensional regression problems
- An Adaptive Surrogate Modeling Based on Deep Neural Networks for Large-Scale Bayesian Inverse Problems
- Data-driven forward discretizations for Bayesian inversion
- Deep learning-based approximation of Goldbach partition function
- Nonlinear approximation and (deep) ReLU networks
- A deep learning approach to Reduced Order Modelling of parameter dependent partial differential equations
- Title not available
- Derivative-informed projected neural networks for high-dimensional parametric maps governed by PDEs
- A theoretical analysis of deep neural networks and parametric PDEs
- Data driven approximation of parametrized PDEs by reduced basis and neural networks
- Simultaneous neural network approximation for smooth functions
- Analysis of the Generalization Error: Empirical Risk Minimization over Deep Artificial Neural Networks Overcomes the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations
- Exponential ReLU neural network approximation rates for point and edge singularities
- Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in \(\pmb{L^2(\mathbb{R}^d,\gamma_d)}\)
- Deep ReLU networks and high-order finite element methods
- Error bounds for approximations with deep ReLU neural networks in \(W^{s,p}\) norms
- Error analysis for physics-informed neural networks (PINNs) approximating Kolmogorov PDEs
- Full error analysis for the training of deep neural networks
- Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
- Optimal approximation of infinite-dimensional holomorphic functions
- Deep ReLU networks and high-order finite element methods. II: Chebyšev emulation
- Operator learning using random features: a tool for scientific computing
- Error analysis for deep neural network approximations of parametric hyperbolic conservation laws
- One-shot learning of surrogates in PDE-constrained optimization under uncertainty
- Wavenumber-Explicit Parametric Holomorphy of Helmholtz Solutions in the Context of Uncertainty Quantification
- Numerical analysis of physics-informed neural networks and related models in physics-informed machine learning
- Neural and spectral operator surrogates: unified construction and expression rate bounds
- Multilevel domain uncertainty quantification in computational electromagnetics
- On the spectral bias of coupled frequency predictor-corrector triangular DNN: the convergence analysis
- On the latent dimension of deep autoencoders for reduced order modeling of PDEs parametrized by random fields
- Energy-dissipative evolutionary deep operator neural networks
- Parametric shape holomorphy of boundary integral operators with applications
- Optimal Dirichlet boundary control by Fourier neural operators applied to nonlinear optics
- wPINNs: Weak Physics Informed Neural Networks for Approximating Entropy Solutions of Hyperbolic Conservation Laws
- Error assessment of an adaptive finite elements -- neural networks method for an elliptic parametric PDE
- Exploiting locality in sparse polynomial approximation of parametric elliptic PDEs and application to parameterized domains
- Learning high frequency data via the coupled frequency predictor-corrector triangular DNN
- Adaptive operator learning for infinite-dimensional Bayesian inverse problems
- Convergence Rates for Learning Linear Operators from Noisy Data
- Limitations of neural network training due to numerical instability of backpropagation
- Neural network expression rates and applications of the deep parametric PDE method in counterparty credit risk