Size and depth of monotone neural networks: interpolation and approximation

From MaRDI portal

Publication: 6404672
arXiv: 2207.05275
MaRDI QID: Q6404672


Authors: Dan Mikulincer, Daniel Reichman


Publication date: 11 July 2022

Abstract: Monotone functions and data sets arise in a variety of applications. We study the interpolation problem for monotone data sets: the input is a monotone data set with $n$ points, and the goal is to find a size- and depth-efficient monotone neural network, with nonnegative parameters and threshold units, that interpolates the data set. We show that there are monotone data sets that cannot be interpolated by a monotone network of depth $2$. On the other hand, we prove that for every monotone data set with $n$ points in $\mathbb{R}^d$, there exists an interpolating monotone network of depth $4$ and size $O(nd)$. Our interpolation result implies that every monotone function over $[0,1]^d$ can be approximated arbitrarily well by a depth-$4$ monotone network, improving on the previous best-known construction, which had depth $d+1$. Finally, building on results from Boolean circuit complexity, we show that the inductive bias of having positive parameters can lead to a super-polynomial blow-up in the number of neurons when approximating monotone functions.
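To illustrate the model class the abstract refers to, the following is a minimal sketch in Python of a threshold network with nonnegative weight matrices. The names (`monotone_network`, `heaviside`) and the toy example are hypothetical, not taken from the paper, and this is not the paper's depth-4 interpolation construction; it only shows why composing monotone threshold units with nonnegative weights yields a monotone function. It assumes that only the weights, not the biases/thresholds, are sign-constrained, which is one reading of the abstract's "nonnegative parameters".

```python
# Minimal sketch (assumptions noted above, not the paper's construction):
# a feedforward network of threshold units with nonnegative weights.
# Each layer computes heaviside(h @ W.T + b); since the step function is
# nondecreasing and W >= 0, the whole network is monotone in its input.

import numpy as np

def heaviside(z):
    """Threshold unit: outputs 1 where z >= 0, else 0."""
    return (z >= 0).astype(float)

def monotone_network(x, layers):
    """Evaluate a threshold network; `layers` is a list of (W, b) pairs,
    and every entry of each W must be nonnegative."""
    h = x
    for W, b in layers:
        assert (W >= 0).all(), "monotone networks have nonnegative weights"
        h = heaviside(h @ W.T + b)
    return h

# Hypothetical toy example in R^2: a depth-2 monotone network computing
# the (monotone) AND of the coordinate thresholds x1 >= 0.5 and x2 >= 0.5.
layer1 = (np.eye(2), np.array([-0.5, -0.5]))          # fire iff x_i >= 0.5
layer2 = (np.array([[1.0, 1.0]]), np.array([-2.0]))   # fire iff both fired

xs = np.array([[0.2, 0.9], [0.6, 0.7], [0.9, 0.4]])
print(monotone_network(xs, [layer1, layer2]))  # -> [[0.], [1.], [0.]]
```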




Has companion code repository: https://github.com/danmiku/monotonenetworks









