ReLU Deep Neural Networks from the Hierarchical Basis Perspective
From MaRDI portal
Publication:6367236
DOI: 10.1016/J.CAMWA.2022.06.006 · arXiv: 2105.04156 · MaRDI QID: Q6367236 · FDO: Q6367236
Authors: Juncai He, Lin Li, Jinchao Xu
Publication date: 10 May 2021
Abstract: We study ReLU deep neural networks (DNNs) by investigating their connections with the hierarchical basis method in finite element methods. First, we show that the approximation schemes of ReLU DNNs for x^2 and xy are composition versions of the hierarchical basis approximation for these two functions. Based on this fact, we obtain a geometric interpretation and systematic proof for the approximation result of ReLU DNNs for polynomials, which plays an important role in a series of recent exponential approximation results for ReLU DNNs. Through our investigation of the connections between ReLU DNNs and the hierarchical basis approximation for x^2 and xy, we show that ReLU DNNs with this special structure can be applied only to approximate quadratic functions. Furthermore, we obtain a concise representation that explicitly reproduces any linear finite element function on a two-dimensional uniform mesh by using ReLU DNNs with only two hidden layers.
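The composed hierarchical-basis scheme the abstract refers to can be illustrated numerically. The sketch below (not the paper's code; it follows the standard construction in which the s-fold composition of a hat function yields the hierarchical basis correction at level s) approximates x^2 on [0, 1] by f_m(x) = x - Σ_{s=1}^{m} g_s(x)/4^s, where g is a hat function written with three ReLU units and g_s is g composed s times; the known error bound is 4^{-(m+1)}:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Hat function g on [0, 1], peaking at 0.5, built from three ReLU units.
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def approx_square(x, m):
    # f_m(x) = x - sum_{s=1}^{m} g_s(x) / 4^s, with g_s the s-fold
    # composition of the hat function (each composition is one more
    # hidden layer in the corresponding ReLU DNN).
    x = np.asarray(x, dtype=float)
    out = x.copy()
    g = x
    for s in range(1, m + 1):
        g = hat(g)           # g_s = g applied to g_{s-1}
        out = out - g / 4.0**s
    return out

x = np.linspace(0.0, 1.0, 1001)
err = np.max(np.abs(approx_square(x, 6) - x**2))  # bounded by 4^{-7}
```

Each additional composition halves the mesh size of the underlying piecewise-linear interpolant, which is why the error decays exponentially in the network depth m.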
MSC classifications: Artificial neural networks and deep learning (68T07); Approximation by other special function classes (41A30); Algorithms for approximation of functions (65D15); Finite element, Rayleigh-Ritz and Galerkin methods for boundary value problems involving PDEs (65N30); Rate of convergence, degree of approximation (41A25)