Approximation spaces of deep neural networks

DOI: 10.1007/s00365-021-09543-4
zbMath: 1491.82017
arXiv: 1905.01208
OpenAlex: W2943191253
MaRDI QID: Q2117336

Rémi Gribonval, Gitta Kutyniok, Morten Nielsen, Felix Voigtlaender

Publication date: 21 March 2022

Published in: Constructive Approximation

Full work available at URL: https://arxiv.org/abs/1905.01208

Related Items

Neural network approximation
Learning with tree tensor networks: complexity estimates and model selection
Designing rotationally invariant neural networks from PDEs and variational methods
Training thinner and deeper neural networks: jumpstart regularization
Full error analysis for the training of deep neural networks
Simultaneous neural network approximation for smooth functions
Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
Deep learning methods for partial differential equations and related parameter identification problems
Approximation theory of tree tensor networks: tensorized univariate functions
Mesh-informed neural networks for operator learning in finite element spaces
Deep ReLU neural networks in high-dimensional approximation
Towards Lower Bounds on the Depth of ReLU Neural Networks
Universal regular conditional distributions via probabilistic transformers
Deep Learning in High Dimension: Neural Network Expression Rates for Analytic Functions in \(\pmb{L^2(\mathbb{R}^d,\gamma_d)}\)
Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
A multivariate Riesz basis of ReLU neural networks
Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
Revisiting Convolutional Neural Networks from the Viewpoint of Kernel-Based Methods
The universal approximation theorem for complex-valued neural networks
Learning ability of interpolating deep convolutional neural networks
Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation
Approximation properties of residual neural networks for Kolmogorov PDEs
Sobolev-type embeddings for neural network approximation spaces
Applied harmonic analysis and data processing. Abstracts from the workshop held March 25--31, 2018
A New Function Space from Barron Class and Application to Neural Network Approximation

