Function approximation by deep neural networks with parameters \(\{0, \pm \frac{1}{2}, \pm 1, 2\}\)
Publication: 2074650
DOI: 10.1007/s42519-021-00229-5
zbMath: 1478.62089
arXiv: 2103.08659
OpenAlex: W3137945565
MaRDI QID: Q2074650
Publication date: 10 February 2022
Published in: Journal of Statistical Theory and Practice
Full work available at URL: https://arxiv.org/abs/2103.08659
Mathematics Subject Classification (MSC):
- Nonparametric regression and quantile regression (62G08)
- Nonparametric estimation (62G05)
- Artificial neural networks and deep learning (68T07)
- General theory of stochastic processes (60G07)
- Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
Related Items (1)
Cites Work
- Estimating a density under order restrictions: Nonasymptotic minimax risk
- A distribution-free theory of nonparametric regression
- Nonparametric regression using deep neural networks with ReLU activation function
- Error bounds for approximations with deep ReLU networks
- Convergence of stochastic processes
- Statistical guarantees for regularized neural networks