The following pages link to Arnulf Jentzen (Q298318):
Displaying 48 items.
- On the Alekseev-Gröbner formula in Banach spaces (Q2321118) (← links)
- Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations (Q2327815) (← links)
- Loss of regularity for Kolmogorov equations (Q2338908) (← links)
- A Milstein scheme for SPDEs (Q2351803) (← links)
- On the differentiability of solutions of stochastic evolution equations with respect to their initial values (Q2402695) (← links)
- A random Euler scheme for Carathéodory differential equations (Q2519725) (← links)
- Space-time error estimates for deep neural network approximations for differential equations (Q2683168) (← links)
- Overcoming the curse of dimensionality in the numerical approximation of backward stochastic differential equations (Q2694433) (← links)
- An overview on deep learning-based approximation methods for partial differential equations (Q2697278) (← links)
- Numerical approximations of stochastic differential equations with non-globally Lipschitz continuous coefficients (Q2944996) (← links)
- Higher Order Pathwise Numerical Approximations of SPDEs with Additive Noise (Q3091818) (← links)
- (Q3101969) (← links)
- Strong and weak divergence in finite time of Euler's method for stochastic differential equations with non-globally Lipschitz continuous coefficients (Q3104819) (← links)
- An Exponential Wagner–Platen Type Scheme for SPDEs (Q3188304) (← links)
- Overcoming the order barrier in the numerical approximation of stochastic partial differential equations with additive space–time noise (Q3561862) (← links)
- (Q3639870) (← links)
- Exponential integrability properties of numerical approximation processes for nonlinear stochastic differential equations (Q4605703) (← links)
- On stochastic differential equations with arbitrarily slow convergence rates for strong approximation in two space dimensions (Q4646879) (← links)
- Galerkin Approximations for the Stochastic Burgers Equation (Q4918825) (← links)
- Deep Splitting Method for Parabolic PDEs (Q4958922) (← links)
- Strong error analysis for stochastic gradient descent optimization algorithms (Q4964091) (← links)
- Convergence in Hölder norms with applications to Monte Carlo methods in infinite dimensions (Q4964092) (← links)
- Solving high-dimensional partial differential equations using deep learning (Q4967451) (← links)
- (Q4969246) (← links)
- An Improved Maximum Allowable Transfer Interval for $L^{p}$-Stability of Networked Control Systems (Q4978674) (← links)
- Solving high-dimensional optimal stopping problems using deep learning (Q5014845) (← links)
- Algorithms for solving high dimensional PDEs: from nonlinear Monte Carlo to machine learning (Q5019943) (← links)
- On nonlinear Feynman–Kac formulas for viscosity solutions of semilinear parabolic partial differential equations (Q5021119) (← links)
- Analysis of the Generalization Error: Empirical Risk Minimization over Deep Artificial Neural Networks Overcomes the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations (Q5037569) (← links)
- Full error analysis for the training of deep neural networks (Q5083408) (← links)
- Uniform error estimates for artificial neural network approximations for heat equations (Q5093099) (← links)
- Strong convergence rates for an explicit numerical approximation method for stochastic evolution equations with non-globally Lipschitz continuous nonlinearities (Q5109466) (← links)
- Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations (Q5161194) (← links)
- Numerical Simulations for Full History Recursive Multilevel Picard Approximations for Systems of High-Dimensional Partial Differential Equations (Q5162373) (← links)
- A unified existence and uniqueness theorem for stochastic evolution equations (Q5187719) (← links)
- A mild Itô formula for SPDEs (Q5234473) (← links)
- Deep optimal stopping (Q5381128) (← links)
- Pathwise convergent higher order numerical schemes for random ordinary differential equations (Q5443629) (← links)
- A Proof that Artificial Neural Networks Overcome the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations (Q5889064) (← links)
- Mathematical Introduction to Deep Learning: Methods, Implementations, and Theory (Q5975790) (← links)
- An efficient Monte Carlo scheme for Zakai equations (Q6058696) (← links)
- Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation (Q6107984) (← links)
- Lower bounds for artificial neural network approximations: a proof that shallow neural networks fail to overcome the curse of dimensionality (Q6155895) (← links)
- Strong convergence rates for explicit space-time discrete numerical approximations of stochastic Allen-Cahn equations (Q6163565) (← links)
- Learning the random variables in Monte Carlo simulations with stochastic gradient descent: Machine learning for parametric PDEs and financial derivative pricing (Q6178392) (← links)
- Deep learning approximations for non-local nonlinear PDEs with Neumann boundary conditions (Q6204733) (← links)
- Local Lipschitz continuity in the initial value and strong completeness for nonlinear stochastic differential equations (Q6244988) (← links)
- A proof that artificial neural networks overcome the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations (Q6306376) (← links)