New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
DOI: 10.1002/cjs.10005
zbMath: 1170.62037
OpenAlex: W2161960669
MaRDI QID: Q3636244
Chunming Zhang, Yuan Jiang, Zuofeng Shang
Publication date: 30 June 2009
Published in: Canadian Journal of Statistics
Full work available at URL: https://doi.org/10.1002/cjs.10005
Keywords: consistency; asymptotic normality; prediction error; local polynomial regression; loss function; Bayes optimal rule
Mathematics Subject Classification:
- Asymptotic properties of parametric estimators (62F12)
- Nonparametric regression and quantile regression (62G08)
- Asymptotic distribution theory in statistics (62E20)
- Asymptotic properties of nonparametric inference (62G20)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Bayesian problems; characterization of Bayes procedures (62C10)
- General nonlinear regression (62J02)
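As supplementary context for the title and keyword terminology (this definitional sketch is not drawn from the record itself, and the paper may use its own generator notation): the Bregman divergence generated by a differentiable, strictly convex function \(\phi\) is commonly written as

\[
  D_{\phi}(y, \mu) \;=\; \phi(y) - \phi(\mu) - \phi'(\mu)\,(y - \mu).
\]

The choice \(\phi(t) = t^2\) recovers the squared error loss \((y - \mu)^2\), while \(\phi(t) = t\log t + (1 - t)\log(1 - t)\) on \((0, 1)\) yields the Bernoulli deviance used in classification. The cited work "On the Optimality of Conditional Expectation as a Bregman Predictor" shows that the conditional mean \(E(Y \mid X)\) minimizes expected Bregman loss, which underlies the divergence's role in both regression and classification.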
Related Items (11)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Arcing classifiers. (With discussion)
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- Local Likelihood Estimation
- On the Optimality of Conditional Expectation as a Bregman Predictor
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- How Biased is the Apparent Error Rate of a Prediction Rule?
- The Estimation of Prediction Error
- The elements of statistical learning. Data mining, inference, and prediction
- Relative loss bounds for on-line density estimation with the exponential family of distributions