Universality of deep convolutional neural networks

From MaRDI portal

Publication:2300759

DOI: 10.1016/j.acha.2019.06.004
zbMath: 1434.68531
arXiv: 1805.10769
OpenAlex: W2963626582
MaRDI QID: Q2300759

Ding-Xuan Zhou

Publication date: 28 February 2020

Published in: Applied and Computational Harmonic Analysis

Full work available at URL: https://arxiv.org/abs/1805.10769




Related Items (79)

Training a Neural-Network-Based Surrogate Model for Aerodynamic Optimisation Using a Gaussian Process
Effects of depth, width, and initialization: A convergence analysis of layer-wise training for deep linear neural networks
Generalization Error Analysis of Neural Networks with Gradient Based Regularization
Learning time-dependent PDEs with a linear and nonlinear separate convolutional neural network
DENSITY RESULTS BY DEEP NEURAL NETWORK OPERATORS WITH INTEGER WEIGHTS
Moreau envelope augmented Lagrangian method for nonconvex optimization with linear constraints
Approximation properties of deep ReLU CNNs
Unnamed Item
Rates of convergence of randomized Kaczmarz algorithms in Hilbert spaces
Neural network interpolation operators activated by smooth ramp functions
Weighted random sampling and reconstruction in general multivariate trigonometric polynomial spaces
Distributed semi-supervised regression learning with coefficient regularization
A note on the applications of one primary function in deep neural networks
Theory of deep convolutional neural networks: downsampling
Deep Neural Networks and PIDE Discretizations
Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
A deep network construction that adapts to intrinsic dimensionality beyond the domain
Theory of deep convolutional neural networks. III: Approximating radial functions
Neural network approximation of continuous functions in high dimensions with applications to inverse problems
Approximating smooth and sparse functions by deep neural networks: optimal approximation rates and saturation
Rates of approximation by ReLU shallow neural networks
Deep learning methods for partial differential equations and related parameter identification problems
Universality of gradient descent neural network training
DNN-based speech watermarking resistant to desynchronization attacks
Neural network interpolation operators optimized by Lagrange polynomial
Convergence of deep convolutional neural networks
Probabilistic robustness estimates for feed-forward neural networks
Approximation Analysis of Convolutional Neural Networks
Approximation error for neural network operators by an averaged modulus of smoothness
Learning rates for the kernel regularized regression with a differentiable strongly convex loss
Approximation by multivariate max-product Kantorovich-type operators and learning rates of least-squares regularized regression
Universal regular conditional distributions via probabilistic transformers
On the K-functional in learning theory
Learning Optimal Multigrid Smoothers via Neural Networks
Scaling Up Bayesian Uncertainty Quantification for Inverse Problems Using Deep Neural Networks
Learning sparse and smooth functions by deep sigmoid nets
Error analysis of kernel regularized pairwise learning with a strongly convex loss
SignReLU neural network and its approximation ability
Neural network interpolation operators of multivariate functions
Self-Supervised Deep Learning for Image Reconstruction: A Langevin Monte Carlo Approach
Connections between Operator-Splitting Methods and Deep Neural Networks with Applications in Image Segmentation
Error bounds for approximations using multichannel deep convolutional neural networks with downsampling
Deep learning theory of distribution regression with CNNs
Approximation of nonlinear functionals using deep ReLU networks
The universal approximation theorem for complex-valued neural networks
Learning ability of interpolating deep convolutional neural networks
LU decomposition and Toeplitz decomposition of a neural network
Deep learning for inverse problems with unknown operator
Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation
Quadratic Neural Networks for Solving Inverse Problems
Computation and learning in high dimensions. Abstracts from the workshop held August 1--7, 2021 (hybrid meeting)
Approximation bounds for convolutional neural networks in operator learning
Deep Network Approximation for Smooth Functions
Unnamed Item
Butterfly-Net: Optimal Function Representation Based on Convolutional Neural Networks
Approximation of smooth functionals using deep ReLU networks
Learning with centered reproducing kernels
Approximation analysis of CNNs from a feature extraction view
Distributed regularized least squares with flexible Gaussian kernels
Online regularized pairwise learning with least squares loss
Theory of deep convolutional neural networks. II: Spherical analysis
Rates of approximation by neural network interpolation operators
Stochastic Markov gradient descent and training low-bit neural networks
Deep neural networks for rotation-invariance approximation and learning
Learning under \((1 + \epsilon)\)-moment conditions
Neural ODEs as the deep limit of ResNets with constant weights
Balanced joint maximum mean discrepancy for deep transfer learning
The construction and approximation of ReLU neural network operators
On the rate of convergence of image classifiers based on convolutional neural networks
Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth
Convolutional spectral kernel learning with generalization guarantees
Distributed Filtered Hyperinterpolation for Noisy Data on the Sphere
Learning rates for partially linear support vector machine in high dimensions
Approximation of functions from Korobov spaces by deep convolutional neural networks
Error bounds for ReLU networks with depth and width parameters
Learnable Empirical Mode Decomposition based on Mathematical Morphology
Analysis of convolutional neural network image classifiers in a hierarchical max-pooling model with additional local pooling
Approximating functions with multi-features by deep convolutional neural networks
Spline representation and redundancies of one-dimensional ReLU neural network models


Uses Software



Cites Work




This page was built for publication: Universality of deep convolutional neural networks