Approximation spaces of deep neural networks

From MaRDI portal

Publication:2117336

DOI: 10.1007/S00365-021-09543-4
zbMath: 1491.82017
arXiv: 1905.01208
OpenAlex: W2943191253
MaRDI QID: Q2117336

Felix Voigtlaender, Gitta Kutyniok, Rémi Gribonval, Morten Nielsen

Publication date: 21 March 2022

Published in: Constructive Approximation

Full work available at URL: https://arxiv.org/abs/1905.01208




Related Items (25)

Neural network approximation
Learning with tree tensor networks: complexity estimates and model selection
Designing rotationally invariant neural networks from PDEs and variational methods
Training thinner and deeper neural networks: jumpstart regularization
Full error analysis for the training of deep neural networks
Simultaneous neural network approximation for smooth functions
Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
Deep learning methods for partial differential equations and related parameter identification problems
Approximation theory of tree tensor networks: tensorized univariate functions
Mesh-informed neural networks for operator learning in finite element spaces
Deep ReLU neural networks in high-dimensional approximation
Towards Lower Bounds on the Depth of ReLU Neural Networks
Universal regular conditional distributions via probabilistic transformers
Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in \(\pmb{L^2(\mathbb{R}^d,\gamma_d)}\)
Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
A multivariate Riesz basis of ReLU neural networks
Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
Revisiting Convolutional Neural Networks from the Viewpoint of Kernel-Based Methods
The universal approximation theorem for complex-valued neural networks
Learning ability of interpolating deep convolutional neural networks
Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation
Approximation properties of residual neural networks for Kolmogorov PDEs
Sobolev-type embeddings for neural network approximation spaces
Applied harmonic analysis and data processing. Abstracts from the workshop held March 25--31, 2018
A New Function Space from Barron Class and Application to Neural Network Approximation



This page was built for publication: Approximation spaces of deep neural networks