ReLU Deep Neural Networks from the Hierarchical Basis Perspective

From MaRDI portal
Publication:6367236

DOI: 10.1016/J.CAMWA.2022.06.006
arXiv: 2105.04156
MaRDI QID: Q6367236


Authors: Juncai He, Lin Li, Jinchao Xu


Publication date: 10 May 2021

Abstract: We study ReLU deep neural networks (DNNs) by investigating their connections with the hierarchical basis method in finite element methods. First, we show that the approximation schemes of ReLU DNNs for x^2 and xy are composition versions of the hierarchical basis approximation for these two functions. Based on this fact, we obtain a geometric interpretation and systematic proof for the approximation result of ReLU DNNs for polynomials, which plays an important role in a series of recent exponential approximation results of ReLU DNNs. Through our investigation of connections between ReLU DNNs and the hierarchical basis approximation for x^2 and xy, we show that ReLU DNNs with this special structure can be applied only to approximate quadratic functions. Furthermore, we obtain a concise representation to explicitly reproduce any linear finite element function on a two-dimensional uniform mesh by using ReLU DNNs with only two hidden layers.
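The composition scheme the abstract refers to is the classical construction in which x^2 is approximated by subtracting scaled compositions of a hat function (the hierarchical basis correction at each level), and xy is then recovered by polarization. The sketch below is an illustrative NumPy reconstruction of that construction on [0, 1] — it is not code from the paper, and the function names (`hat`, `approx_square`, `approx_product`) are ours:

```python
import numpy as np

def hat(x):
    # Hat function on [0, 1]: piecewise linear, peaks at 1/2.
    # Expressible with ReLUs: 2*relu(x) - 4*relu(x - 1/2) + 2*relu(x - 1).
    return np.where(x < 0.5, 2.0 * x, 2.0 * (1.0 - x))

def approx_square(x, m):
    # Level-m approximation of x**2 on [0, 1]:
    #   f_m(x) = x - sum_{s=1}^{m} g_s(x) / 4**s,
    # where g_s is the s-fold composition of the hat function (a sawtooth).
    # Each term is exactly the level-s hierarchical basis correction, so
    # f_m is the piecewise linear interpolant of x**2 on the grid of
    # width 2**-m, with max error 4**-(m+1).
    out = np.asarray(x, dtype=float).copy()
    g = out.copy()
    for s in range(1, m + 1):
        g = hat(g)            # sawtooth with 2**(s-1) teeth
        out -= g / 4.0**s
    return out

def approx_product(x, y, m):
    # Polarization identity: xy = 2*((x+y)/2)**2 - x**2/2 - y**2/2,
    # with (x+y)/2 in [0, 1] whenever x, y are.
    return (2.0 * approx_square((x + y) / 2.0, m)
            - 0.5 * approx_square(x, m)
            - 0.5 * approx_square(y, m))

x = np.linspace(0.0, 1.0, 1001)
for m in (2, 4, 6):
    err = np.max(np.abs(approx_square(x, m) - x**2))
    print(f"m={m}: max |f_m(x) - x^2| = {err:.2e} (bound {4.0**-(m+1):.2e})")
```

The geometric picture matches the abstract: each composition level adds the hierarchical basis correction on the next dyadic grid, which is why the error decays exponentially in depth.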

