On the approximation of functions by tanh neural networks
Publication: 6055124
DOI: 10.1016/j.neunet.2021.08.015 · arXiv: 2104.08938 · MaRDI QID: Q6055124
Tim De Ryck, Siddhartha Mishra, Samuel Lanthaler
Publication date: 28 September 2023
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/2104.08938
MSC classification: Artificial neural networks and deep learning (68T07); Approximation by other special function classes (41A30)
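As a purely illustrative aside (not part of the bibliographic record), the sketch below shows the publication's subject in miniature: approximating a smooth one-dimensional function with a one-hidden-layer tanh network. The random-feature construction, the target function, the grid, and the layer width are all assumptions made for this example and do not reproduce the paper's construction or its error bounds.

```python
# Minimal illustrative sketch (assumptions only, not the paper's method):
# fit a shallow tanh network to a smooth 1-D target by drawing random
# hidden weights and solving for the output layer by least squares.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Smooth function to approximate on [-1, 1] (assumed for the example).
    return np.sin(np.pi * x)

# Training grid and (assumed) hidden-layer sizes.
x = np.linspace(-1.0, 1.0, 200)[:, None]        # inputs, shape (200, 1)
n_hidden = 50
W = rng.normal(scale=3.0, size=(1, n_hidden))   # random input weights
b = rng.uniform(-3.0, 3.0, size=n_hidden)       # random biases

# Hidden features tanh(xW + b), shape (200, n_hidden).
H = np.tanh(x @ W + b)

# Linear output layer fitted by least squares.
coef, *_ = np.linalg.lstsq(H, target(x), rcond=None)

approx = H @ coef
print("max pointwise error:", np.max(np.abs(approx - target(x))))
```

Running the script prints the maximum pointwise error of the fitted network on the training grid; with the assumed width of 50 tanh units the smooth target is resolved to small error, which is the kind of behaviour the paper quantifies with explicit approximation rates.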
Related Items (13)
- Convergence of Physics-Informed Neural Networks Applied to Linear Second-Order Elliptic Interface Problems
- Scientific machine learning through physics-informed neural networks: where we are and what's next
- Variational physics informed neural networks: the role of quadratures and test functions
- Wasserstein generative adversarial uncertainty quantification in physics-informed neural networks
- Galerkin neural network approximation of singularly-perturbed elliptic systems
- Symplectic learning for Hamiltonian neural networks
- Physics-informed neural networks for approximating dynamic (hyperbolic) PDEs of second order in time: error analysis and algorithms
- Higher-order error estimates for physics-informed neural networks approximating the primitive equations
- Error convergence and engineering-guided hyperparameter search of PINNs: towards optimized I-FENN performance
- The Mori-Zwanzig formulation of deep learning
- wPINNs: Weak Physics Informed Neural Networks for Approximating Entropy Solutions of Hyperbolic Conservation Laws
- Unnamed Item
- Error analysis for physics-informed neural networks (PINNs) approximating Kolmogorov PDEs
Cites Work
- Unnamed Item
- Unnamed Item
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations
- Deep learning observables in computational fluid dynamics
- Approximation and estimation bounds for artificial neural networks
- Why does deep and cheap learning work so well?
- Hidden physics models: machine learning of nonlinear partial differential equations
- Multilayer feedforward networks are universal approximators
- Random approximants and neural networks
- Approximation rates for neural networks with general activation functions
- Exponential convergence of the deep neural network approximation for analytic functions
- Iterative surrogate model optimization (ISMO): an active learning algorithm for PDE constrained optimization with deep neural networks
- Efficient approximation of solutions of parametric linear transport equations by ReLU DNNs
- Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem
- Approximation rates for neural networks with encodable weights in smoothness spaces
- On the approximation by single hidden layer feedforward neural networks with fixed weights
- Error bounds for approximations with deep ReLU networks
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- On the mathematical foundations of learning
- On Polynomial Approximation in Sobolev Spaces
- Fast Computation of Fourier Integral Operators
- Geometric Upper Bounds on Rates of Variable-Basis Approximation
- A Fast Butterfly Algorithm for the Computation of Fourier Integral Operators
- Combinatorial Multinomial Matrices and Multinomial Stirling Numbers
- Universal approximation bounds for superpositions of a sigmoidal function
- Deep learning in high dimension: Neural network expression rates for generalized polynomial chaos expansions in UQ
- A Multivariate Faa di Bruno Formula with Applications
- Solving high-dimensional partial differential equations using deep learning
- Enhancing Accuracy of Deep Learning Algorithms by Training with Low-Discrepancy Sequences
- New Error Bounds for Deep ReLU Networks Using Sparse Grids
- Deep ReLU networks and high-order finite element methods
- Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms
- Deep neural network expression of posterior expectations in Bayesian PDE inversion
- Dependence of Computational Models on Input Dimension: Tractability of Approximation and Optimization Tasks
- Understanding Machine Learning
- Derivative Polynomials for tanh, tan, sech and sec in Explicit Form
- Approximation by superpositions of a sigmoidal function