Neural network with unbounded activation functions is universal approximator
From MaRDI portal
Publication:2399647
DOI: 10.1016/j.acha.2015.12.005
zbMath: 1420.68177
arXiv: 1505.03654
OpenAlex: W3101806332
MaRDI QID: Q2399647
Publication date: 24 August 2017
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://arxiv.org/abs/1505.03654
Keywords: neural network; Radon transform; integral representation; universal approximation; admissibility condition; ridgelet transform; rectified linear unit (ReLU); backprojection filter; bounded extension to \(L^2\); Lizorkin distribution
Related Items (35)
Machine learning from a continuous viewpoint. I ⋮ Nonconvex regularization for sparse neural networks ⋮ Neural dynamic sliding mode control of nonlinear systems with both matched and mismatched uncertainties ⋮ On the minimax optimality and superiority of deep neural network learning over sparse parameter spaces ⋮ Piecewise linear functions representable with infinite width shallow ReLU neural networks ⋮ Theory of deep convolutional neural networks. III: Approximating radial functions ⋮ Universal approximation properties for an ODENet and a ResNet: mathematical analysis and numerical experiments ⋮ A survey on modern trainable activation functions ⋮ Continuity properties of the shearlet transform and the shearlet synthesis operator on the Lizorkin type spaces ⋮ Deep reinforcement learning for adaptive mesh refinement ⋮ The role of nonpolynomiality in uniform approximation by RBF networks of Hankel translates ⋮ Explicit representations for Banach subspaces of Lizorkin distributions ⋮ Rapid estimation of permeability from digital rock using 3D convolutional neural network ⋮ Heaviside function as an activation function ⋮ Unnamed Item ⋮ Beating a Benchmark: Dynamic Programming May Not Be the Right Numerical Approach ⋮ Hilbert C∗-Module for Analyzing Structured Data ⋮ Unnamed Item ⋮ Unnamed Item ⋮ Center Manifold Analysis of Plateau Phenomena Caused by Degeneration of Three-Layer Perceptron ⋮ An Interpretive Constrained Linear Model for ResNet and MgNet ⋮ A global universality of two-layer neural networks with ReLU activations ⋮ Misspecified diffusion models with high-frequency observations and an application to neural networks ⋮ Estimation of agent-based models using Bayesian deep learning approach of BayesFlow ⋮ A deep learning semiparametric regression for adjusting complex confounding structures ⋮ Regression methods in waveform modeling: a comparative study ⋮ Geometric deep learning for computational mechanics. I: Anisotropic hyperelasticity ⋮ On the double windowed ridgelet transform and its inverse ⋮ Deep learning as optimal control problems: models and numerical methods ⋮ Fast generalization error bound of deep learning without scale invariance of activation functions ⋮ Theory of deep convolutional neural networks. II: Spherical analysis ⋮ A mean-field optimal control formulation of deep learning ⋮ Symmetry \& critical points for a model shallow neural network ⋮ Unnamed Item ⋮ Unnamed Item
Uses Software
Cites Work
- Morrey and Campanato meet Besov, Lizorkin and Triebel
- Functional analysis, Sobolev spaces and partial differential equations
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- Approximation by superposition of sigmoidal and radial basis functions
- The Calderón reproducing formula, windowed \(X\)-ray transforms, and Radon transforms in \(L^p\)-spaces
- Harmonic analysis of neural networks
- Convolution-backprojection method for the \(k\)-plane transform, and Calderón's identity for ridgelet transforms
- Continuity of the Radon transform and its inverse on Euclidean space
- Complexity estimates based on integral transforms induced by computational units
- A Sobolev-type upper bound for rates of approximation by linear combinations of Heaviside plane waves
- Integral Geometry and Radon Transforms
- The Ridgelet Transform and Quasiasymptotic Behavior of Distributions
- Classical Fourier Analysis
- Tight frames of \(k\)-plane ridgelets and the problem of representing objects that are smooth away from \(d\)-dimensional singularities in \(\mathbb{R}^n\)
- Universal approximation bounds for superpositions of a sigmoidal function
- A birth and death model of neuron firing
- The Ridgelet transform of distributions
- Sparse Image and Signal Processing
- Ridge functions and orthonormal ridgelets
This page was built for publication: Neural network with unbounded activation functions is universal approximator