Asymptotic properties of neural network sieve estimators
From MaRDI portal
Publication: 6091912
DOI: 10.1080/10485252.2023.2209218
arXiv: 1906.00875
MaRDI QID: Q6091912
Qing Lu, Chang Jiang, Lyudmila Sakhanenko, Xiaoxi Shen
Publication date: 21 November 2023
Published in: Journal of Nonparametric Statistics
Full work available at URL: https://arxiv.org/abs/1906.00875
Cites Work
- Probability inequalities for empirical processes and a law of the iterated logarithm
- Approximation and estimation bounds for artificial neural networks
- Convergence rate of sieve estimates
- On methods of sieves and penalization
- Multilayer feedforward networks are universal approximators
- Optimal global rates of convergence for nonparametric regression
- A distribution-free theory of nonparametric regression
- Random approximants and neural networks
- Weak convergence and empirical processes. With applications to statistics
- On the rate of convergence of fully connected deep neural network regression estimates
- Nonparametric regression using deep neural networks with ReLU activation function
- A goodness-of-fit test based on neural network sieve estimators
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- Sieve Extremum Estimates for Weakly Dependent Data
- Improved rates and asymptotic normality for nonparametric neural network estimators
- Deep Neural Networks for Estimation and Inference