Estimation and selection procedures in regression: an \(L_1\) approach
Publication: 4546737
Recommendations
- Upper bounds for the \(L_1\)-risk of the minimum \(L_1\)-distance regression estimator
- Nonparametric estimation of a regression function
- Estimating a regression function
- Asymptotics of the “minimum \(L_1\)-norm” estimates in nonparametric regression models
- Regression function estimation as a partly inverse problem
Cites work
- scientific article; zbMATH DE number 739534
- A universally acceptable smoothing factor for kernel density estimates
- Convergence of stochastic processes
- Estimating a regression function
- Inequalities for the $r$th Absolute Moment of a Sum of Random Variables, $1 \leqq r \leqq 2$
- Local linear regression smoothers and their minimax efficiencies
- Model selection in nonparametric regression
- Nonasymptotic universal smoothing factors, kernel complexity and Yatracos classes
- Rates of convergence of estimates, Kolmogorov's entropy and the dimensionality reduction principle in regression
- Rates of convergence of minimum distance estimators and Kolmogorov's entropy
- Universal consistency of local polynomial kernel regression estimates
- Weak convergence and empirical processes. With applications to statistics
Cited in (8)
- A universal procedure for aggregating estimators
- A note on penalized minimum distance estimation in nonparametric regression
- Model selection in nonparametric regression
- Clarification: Regression Model Selection—A Residual Likelihood Approach
- A note on minimum distance estimation of copula densities
- On finite-sample properties of adaptive least squares regression estimates
- APPLIED REGRESSION ANALYSIS BIBLIOGRAPHY UPDATE 2000–2001
- Aggregating estimates by convex optimization