Approximation of smoothness classes by deep rectifier networks


DOI: 10.1137/20M1360657
zbMATH Open: 1494.41008
arXiv: 2007.15645
MaRDI QID: Q5020751
FDO: Q5020751

Mazen Ali, Anthony Nouy

Publication date: 7 January 2022

Published in: SIAM Journal on Numerical Analysis

Abstract: We consider approximation rates of sparsely connected deep rectified linear unit (ReLU) and rectified power unit (RePU) neural networks for functions in Besov spaces $B^\alpha_q(L^p)$ in arbitrary dimension $d$, on general domains. We show that deep rectifier networks with a fixed activation function attain optimal or near-optimal approximation rates for functions in the Besov space $B^\alpha_\tau(L^\tau)$ on the critical embedding line $1/\tau = \alpha/d + 1/p$ for arbitrary smoothness order $\alpha > 0$. Using interpolation theory, this implies that the entire range of smoothness classes at or above the critical line is (near-)optimally approximated by deep ReLU/RePU networks.
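
For orientation, the critical embedding line and the benchmark rate it encodes can be written out explicitly. The following LaTeX sketch states the rate known from classical nonlinear ($n$-term) approximation theory; the symbol $E_n$ (best $n$-term error) and the identification of $n$ with the number of nonzero network weights are illustrative assumptions suggested by the abstract, not the paper's exact theorem statement.

% Critical embedding line: for approximation in L^p on a domain
% \Omega \subset \mathbb{R}^d, smoothness \alpha > 0 is measured in
% B^\alpha_\tau(L^\tau) with \tau determined by
\[
  \frac{1}{\tau} \;=\; \frac{\alpha}{d} + \frac{1}{p}.
\]
% Classical benchmark (nonlinear n-term approximation): for
% f \in B^\alpha_\tau(L^\tau(\Omega)) on this line, the best n-term
% error in L^p decays at the rate
\[
  E_n(f)_{L^p(\Omega)} \;\lesssim\; n^{-\alpha/d}\,
  \|f\|_{B^\alpha_\tau(L^\tau(\Omega))}.
\]
% Reading of the abstract (assumption): with n interpreted as the
% number of nonzero weights, sparsely connected deep ReLU/RePU
% networks attain this rate optimally or near-optimally (up to
% logarithmic factors in n) for arbitrary \alpha > 0.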


Full work available at URL: https://arxiv.org/abs/2007.15645




Cited in: 5 publications




