A lower bound on the error in nonparametric regression type problems
From MaRDI portal
Cited in (18):
- Convergence rates for kernel regression in infinite-dimensional spaces
- \(L_1\)-optimal estimates for a regression type function in \(R^d\)
- On consistent statistical procedures in regression
- Some theoretical results on neural spike train probability models
- Dependence and the dimensionality reduction principle
- A general lower bound of minimax risk for absolute-error loss
- Optimal global rate of convergence in nonparametric regression with left-truncated and right-censored data
- Global nonparametric estimation of conditional quantile functions and their derivatives
- Kernel estimation of discontinuous regression functions
- scientific article (no title available); zbMATH DE number 7370563
- Upper bounds for errors of estimators in a problem of nonparametric regression: the adaptive case and the case of unknown measure \(\rho _X\)
- Nonparametric matrix regression function estimation over symmetric positive definite matrices
- Directional mixture models and optimal estimation of the mixing density
- Information-theoretic determination of minimax rates of convergence
- scientific article (no title available); zbMATH DE number 4056794
- Optimal spherical deconvolution
- Logspline density estimation for binned data
- Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression
MaRDI item Q1106583