Sharp estimation in sup norm with random design
From MaRDI portal
Publication:997249
DOI: 10.1016/J.SPL.2006.11.017 · zbMATH Open: 1114.62046 · arXiv: math/0509634 · OpenAlex: W2046144164 · MaRDI QID: Q997249
Publication date: 23 July 2007
Published in: Statistics & Probability Letters
Abstract: The aim of this paper is to recover the regression function with sup-norm loss. We construct an asymptotically sharp estimator which converges with the spatially dependent rate $r_{n,\mu}(x) = P \big( \log n / (n \mu(x)) \big)^{s/(2s+1)}$, where $\mu$ is the design density, $s$ the regression smoothness, $n$ the sample size, and $P$ is a constant expressed in terms of a solution to a problem of optimal recovery as in Donoho (1994). We prove this result under the assumption that $\mu$ is positive and continuous. The estimator combines kernel and local polynomial methods, where the kernel is given by optimal recovery, which allows us to prove the result up to the constants for any $s > 0$. Moreover, the estimator does not depend on $\mu$. We prove that this rate is optimal in a sense which is stronger than the classical minimax lower bound. Then, an inhomogeneous confidence band is proposed. This band has a non-constant length which depends on the local amount of data.
Full work available at URL: https://arxiv.org/abs/math/0509634
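The rate in the abstract depends on the point $x$ through the design density $\mu(x)$: where observations are scarce, the local rate deteriorates, which is what makes the proposed confidence band inhomogeneous. The following is a minimal Python sketch of that formula only; the constant P, the smoothness s, and the Beta(2,2) design density are placeholder choices for illustration, not quantities taken from the paper.

```python
import numpy as np

def sup_norm_rate(x, n, s=2.0, P=1.0, mu=lambda t: np.ones_like(t)):
    """Evaluate the spatially dependent rate
    r_{n,mu}(x) = P * (log n / (n * mu(x)))**(s / (2s + 1))
    at the points x, for sample size n and design density mu."""
    return P * (np.log(n) / (n * mu(x))) ** (s / (2 * s + 1))

# Example: a Beta(2,2) design density puts little mass near the
# boundaries, so the local rate is larger (worse) there.
x = np.linspace(0.05, 0.95, 5)
beta_density = lambda t: 6 * t * (1 - t)
print(sup_norm_rate(x, n=10_000, mu=beta_density))
```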
Mathematics Subject Classification
- Nonparametric estimation (62G05)
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
Cites Work
- Title not available
- Title not available
- Optimal global rates of convergence for nonparametric regression
- Asymptotic equivalence of nonparametric regression and white noise
- Asymptotic equivalence theory for nonparametric regression with random design
- Optimal spatial adaptation to inhomogeneous smoothness: An approach based on kernel estimates with variable bandwidth selectors
- Optimal recovery of operators, and related problems
- Asymptotic minimax risk for sup-norm loss: Solution via optimal recovery
- An Asymptotically Minimax Regression Estimator in the Uniform Norm up to Exact Constant
- Asymptotically exact minimax estimation in sup-norm for anisotropic Hölder classes
- The asymptotic minimax constant for sup-norm loss in nonparametric density estimation
- On the solution of an optimal recovery problem and its applications in nonparametric regression
- Remarks on extremal problems in nonparametric curve estimation
- Asymptotically exact nonparametric hypothesis testing in sup-norm and at a fixed point
- Renormalization exponents and optimal pointwise rates of convergence
- Minimax exact constant in sup-norm for nonparametric regression with random design
Cited In (6)
- A sup-norm oracle inequality for a partially linear regression model
- Spatially inhomogeneous linear inverse problems with possible singularities
- A note on wavelet estimation of the derivatives of a regression function in a random design setting
- Asymptotic equivalence for nonparametric regression with multivariate and random design
- A robust, adaptive M-estimator for pointwise estimation in heteroscedastic regression
- Classification via local multi-resolution projections
Recommendations
- Sharp bounds on the variance in randomized experiments
- Sharp estimates of deviations of the sample mean in many dimensions
- Sharp large deviations in nonparametric estimation
- Sharp bounds on L-estimates and their expectations for dependent samples
- Shrinkage estimation for nearly singular designs
- Minimax exact constant in sup-norm for nonparametric regression with random design
- Sharp estimate on the supremum of a class of sums of small i.i.d. random variables
- A sharp estimate for probability distributions
- On sharp nonparametric estimation of differentiable functions
This page was built for publication: Sharp estimation in sup norm with random design