Pages that link to "Item:Q5025773"
From MaRDI portal
The following pages link to Optimal Approximation with Sparsely Connected Deep Neural Networks (Q5025773):
Displaying 45 items.
- Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms (Q5132228)
- Rectified deep neural networks overcome the curse of dimensionality for nonsmooth value functions in zero-sum games of nonlinear stiff systems (Q5132232)
- Deep Network Approximation for Smooth Functions (Q5155613)
- Learning on dynamic statistical manifolds (Q5161017)
- Butterfly-Net: Optimal Function Representation Based on Convolutional Neural Networks (Q5162362)
- Deep Nitsche Method: Deep Ritz Method with Essential Boundary Conditions (Q5163229)
- Equivalence of approximation by convolutional neural networks and fully-connected networks (Q5218202)
- Solving inverse problems using data-driven models (Q5230520)
- Deep hedging (Q5234357)
- BOUNDS ON MULTI-ASSET DERIVATIVES VIA NEURAL NETWORKS (Q5854317)
- Spline representation and redundancies of one-dimensional ReLU neural network models (Q5873929)
- Expressivity of Deep Neural Networks (Q5879776)
- Neural network approximation (Q5887830)
- A Proof that Artificial Neural Networks Overcome the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations (Q5889064)
- Approximation bounds for norm constrained neural networks with applications to regression and GANs (Q6038825)
- Simultaneous neural network approximation for smooth functions (Q6052416)
- A deep network construction that adapts to intrinsic dimensionality beyond the domain (Q6054952)
- Theory of deep convolutional neural networks. III: Approximating radial functions (Q6055154)
- Randomized neural network with Petrov-Galerkin methods for solving linear and nonlinear partial differential equations (Q6058946)
- Rates of approximation by ReLU shallow neural networks (Q6062171)
- An introduction to the mathematics of deep learning (Q6064555)
- Three ways to solve partial differential equations with neural networks — A review (Q6068232)
- Deep dynamic modeling with just two time points: Can we still allow for individual trajectories? (Q6068870)
- Deep learning methods for partial differential equations and related parameter identification problems (Q6070739)
- On decision regions of narrow deep neural networks (Q6078750)
- Sparsity in long-time control of neural ODEs (Q6099693)
- Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation (Q6107984)
- Limitations of neural network training due to numerical instability of backpropagation (Q6122651)
- SignReLU neural network and its approximation ability (Q6126040)
- Invariant spectral foliations with applications to model order reduction and synthesis (Q6132386)
- A multivariate Riesz basis of ReLU neural networks (Q6144893)
- Lower bounds for artificial neural network approximations: a proof that shallow neural networks fail to overcome the curse of dimensionality (Q6155895)
- Neural network approximation and estimation of classifiers with classification boundary in a Barron class (Q6165247)
- Learning ability of interpolating deep convolutional neural networks (Q6185680)
- The mathematics of artificial intelligence (Q6200206)
- Approximation in shift-invariant spaces with deep ReLU neural networks (Q6341347)
- Integral representations of shallow neural network with Rectified Power Unit activation function (Q6386309)
- Approximation of smooth functionals using deep ReLU networks (Q6488836)
- Approximation analysis of CNNs from a feature extraction view (Q6496341)
- Low dimensional approximation and generalization of multivariate functions on smooth manifolds using deep ReLU neural networks (Q6536393)
- Proof of the theory-to-practice gap in deep learning via sampling complexity bounds for neural network approximation spaces (Q6592113)
- Neural and spectral operator surrogates: unified construction and expression rate bounds (Q6601288)
- Approximation rates for deep calibration of (rough) stochastic volatility models (Q6606848)
- Sampling complexity of deep approximation spaces (Q6649919)
- Weighted variation spaces and approximation by shallow ReLU networks (Q6652573)