Exponential convergence of the deep neural network approximation for analytic functions


DOI: 10.1007/s11425-018-9387-x
zbMath: 1475.65007
arXiv: 1807.00297
OpenAlex: W2963172624
Wikidata: Q115602254
MaRDI QID: Q1989902

Qingcan Wang, Weinan E

Publication date: 29 October 2018

Published in: Science China. Mathematics

Full work available at URL: https://arxiv.org/abs/1807.00297




Related Items (36)

Structure probing neural network deflation
Neural network approximation
Analytic continuation of noisy data using Adams Bashforth residual neural network
SelectNet: self-paced learning for high-dimensional partial differential equations
ReLU deep neural networks from the hierarchical basis perspective
Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
Full error analysis for the training of deep neural networks
Sparse approximation of triangular transports. I: The finite-dimensional case
A note on the applications of one primary function in deep neural networks
Nonlinear approximation via compositions
Simultaneous neural network approximation for smooth functions
Deep Neural Networks for Solving Large Linear Systems Arising from High-Dimensional Problems
On the approximation of functions by tanh neural networks
Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
Convergence of deep convolutional neural networks
Deep ReLU neural networks in high-dimensional approximation
Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation
Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
Improved training of physics-informed neural networks for parabolic differential equations with sharply perturbed initial conditions
On mathematical modeling in image reconstruction and beyond
Computation and learning in high dimensions. Abstracts from the workshop held August 1--7, 2021 (hybrid meeting)
Deep ReLU networks and high-order finite element methods
Rectified deep neural networks overcome the curse of dimensionality for nonsmooth value functions in zero-sum games of nonlinear stiff systems
PDE-Net 2.0: learning PDEs from data with a numeric-symbolic hybrid deep network
Deep Network Approximation for Smooth Functions
Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units
Selection dynamics for deep neural networks
Optimal approximation rate of ReLU networks in terms of width and depth
Constructive deep ReLU neural network approximation
MgNet: a unified framework of multigrid and convolutional neural network
The Gap between Theory and Practice in Function Approximation with Deep Neural Networks
Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth
Non-homogeneous Poisson process intensity modeling and estimation using measure transport
Nonlinear approximation and (deep) ReLU networks
Exponential ReLU DNN expression of holomorphic maps in high dimension




