Approximation rates for neural networks with general activation functions


DOI: 10.1016/j.neunet.2020.05.019
zbMath: 1480.41007
arXiv: 1904.02311
OpenAlex: W3027734040
Wikidata: Q96018643
Scholia: Q96018643
MaRDI QID: Q1982446

Jonathan W. Siegel, Jin-Chao Xu

Publication date: 8 September 2021

Published in: Neural Networks

Full work available at URL: https://arxiv.org/abs/1904.02311

Related Items (39)

Stationary Density Estimation of Itô Diffusions Using Deep Learning
Deep Adaptive Basis Galerkin Method for High-Dimensional Evolution Equations With Oscillatory Solutions
Least-squares ReLU neural network (LSNN) method for linear advection-reaction equation
A mesh-free method using piecewise deep neural network for elliptic interface problems
Approximation properties of deep ReLU CNNs
ReLU deep neural networks from the hierarchical basis perspective
Randomized Newton's method for solving differential equations based on the neural network discretization
Convergence Rate Analysis for Deep Ritz Method
HomPINNs: Homotopy physics-informed neural networks for learning multiple solutions of nonlinear elliptic differential equations
Deep Ritz Method for the Spectral Fractional Laplacian Equation Using the Caffarelli–Silvestre Extension
Wasserstein generative adversarial uncertainty quantification in physics-informed neural networks
Simultaneous neural network approximation for smooth functions
Framework for segmented threshold \(\ell_0\) gradient approximation based network for sparse signal recovery
Deep Neural Networks for Solving Large Linear Systems Arising from High-Dimensional Problems
On the approximation of functions by tanh neural networks
Divide-and-conquer DNN approach for the inverse point source problem using a few single frequency measurements
A priori generalization error analysis of two-layer neural networks for solving high dimensional Schrödinger eigenvalue problems
Galerkin neural network approximation of singularly-perturbed elliptic systems
Characterization of the variation spaces corresponding to shallow neural networks
Universal regular conditional distributions via probabilistic transformers
Friedrichs Learning: Weak Solutions of Partial Differential Equations via Deep Learning
Active learning based sampling for high-dimensional nonlinear partial differential equations
A mathematical perspective of machine learning
Unnamed Item
Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
Neural network approximation and estimation of classifiers with classification boundary in a Barron class
A priori error estimate of deep mixed residual method for elliptic PDEs
Convergence Analysis of the Deep Galerkin Method for Weak Solutions
Designing universal causal deep learning models: The geometric (Hyper)transformer
Neural network stochastic differential equation models with applications to financial data forecasting
Greedy training algorithms for neural networks and applications to PDEs
Finite Neuron Method and Convergence Analysis
Convergence analysis of neural networks for solving a free boundary problem
Unnamed Item
Optimal approximation rate of ReLU networks in terms of width and depth
Construct Deep Neural Networks based on Direct Sampling Methods for Solving Electrical Impedance Tomography
Two neural-network-based methods for solving elliptic obstacle problems
High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions
A New Function Space from Barron Class and Application to Neural Network Approximation


