Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem

Publication: 2055036

DOI: 10.1016/j.neunet.2019.12.013
OpenAlex: W3031010353
Wikidata: Q96028494
Scholia: Q96028494
MaRDI QID: Q2055036

Hadrien Montanelli, Haizhao Yang

Publication date: 3 December 2021

Published in: Neural Networks

Full work available at URL: https://arxiv.org/abs/1906.11945




Related Items (23)

Structure probing neural network deflation
Stationary Density Estimation of Itô Diffusions Using Deep Learning
Machine learning for prediction with missing dynamics
SelectNet: self-paced learning for high-dimensional partial differential equations
Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
The Discovery of Dynamics via Linear Multistep Methods and Deep Learning: Error Estimation
Simultaneous neural network approximation for smooth functions
Neural network approximation: three hidden layers are enough
On the approximation of functions by tanh neural networks
A note on computing with Kolmogorov superpositions without iterations
Convergence of deep convolutional neural networks
The Kolmogorov-Arnold representation theorem revisited
Friedrichs Learning: Weak Solutions of Partial Differential Equations via Deep Learning
Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
Deep nonparametric estimation of intrinsic data structures by chart autoencoders: generalization error and robustness
Approximation of compositional functions with ReLU neural networks
Growing axons: greedy learning of neural networks with application to function approximation
A three layer neural network can represent any multivariate function
Convergence rate of DeepONets for learning operators arising from advection-diffusion equations
ExSpliNet: An interpretable and expressive spline-based neural network
Deep Network Approximation for Smooth Functions
Deep Network Approximation Characterized by Number of Neurons
Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth




