Nonparametric regression using deep neural networks with ReLU activation function
DOI: 10.1214/19-AOS1875 · zbMath: 1459.62059 · arXiv: 1708.06633 · OpenAlex: W3102511045 · MaRDI QID: Q2215715
Publication date: 14 December 2020
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1708.06633
Keywords: wavelets, additive models, nonparametric regression, multilayer neural networks, rectified linear unit (ReLU), minimax estimation risk, rectified linear activation function
MSC classifications:
- Nonparametric regression and quantile regression (62G08)
- Nontrigonometric harmonic analysis involving wavelets and other special systems (42C40)
- Artificial neural networks and deep learning (68T07)
- Minimax procedures in statistical decision theory (62C20)
- Neural nets and related approaches to inference from stochastic processes (62M45)
Related Items (88)
Cites Work
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Nonparametric estimation of composite functions
- Approximation and estimation bounds for artificial neural networks
- Wavelets on the interval and fast wavelet transforms
- A regularity class for the roots of nonnegative functions
- Multilayer feedforward networks are universal approximators
- A distribution-free theory of nonparametric regression
- Approximation properties of a multilayered feedforward artificial neural network
- Error bounds for approximations with deep ReLU networks
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Estimating composite functions by model selection
- Rate-optimal estimation for a general class of nonparametric regression models with unknown link functions
- Concentration inequalities and asymptotic results for ratio type empirical processes
- Nonasymptotic bounds on the \(L_{2}\) error of neural network regression estimates
- Deep vs. shallow networks: An approximation theory perspective
- Adaptive regression estimation with multilayer feedforward neural networks
- Universal approximation bounds for superpositions of a sigmoidal function
- Neural Network Learning
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Nonparametric Regression Based on Hierarchical Interaction Models
- Breaking the Curse of Dimensionality with Convex Neural Networks
- Universal Approximation Depth and Errors of Narrow Belief Networks with Discrete Units
- All of Nonparametric Statistics
- Introduction to nonparametric estimation
- Approximation by superpositions of a sigmoidal function