A lower bound on the error in nonparametric regression type problems
From MaRDI portal
Cited in (18)
- Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression
- Scientific article (zbMATH DE number 7370563; no title available)
- Upper bounds for errors of estimators in a problem of nonparametric regression: the adaptive case and the case of unknown measure \(\rho _X\)
- Optimal global rate of convergence in nonparametric regression with left-truncated and right-censored data
- Nonparametric matrix regression function estimation over symmetric positive definite matrices
- Scientific article (zbMATH DE number 4056794; no title available)
- Dependence and the dimensionality reduction principle
- Global nonparametric estimation of conditional quantile functions and their derivatives
- Some theoretical results on neural spike train probability models
- Kernel estimation of discontinuous regression functions
- Directional mixture models and optimal estimation of the mixing density
- \(L_1\)-optimal estimates for a regression type function in \(R^d\)
- Information-theoretic determination of minimax rates of convergence
- On consistent statistical procedures in regression
- Optimal spherical deconvolution
- Logspline density estimation for binned data
- A general lower bound of minimax risk for absolute-error loss
- Convergence rates for kernel regression in infinite-dimensional spaces