New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
DOI: 10.1002/cjs.10005 · zbMATH Open: 1170.62037 · OpenAlex: W2161960669 · MaRDI QID: Q3636244 · FDO: Q3636244
Chunming Zhang, Yuan Jiang, Zuofeng Shang
Publication date: 30 June 2009
Published in: The Canadian Journal of Statistics
Full work available at URL: https://doi.org/10.1002/cjs.10005
Recommendations
- Penalized Bregman divergence for large-dimensional regression and classification
- Prediction Error Estimation Under Bregman Divergence for Non‐Parametric Regression and Classification
- Statistical inference of minimum BD estimators and classifiers for varying-dimensional models
- Functional Bregman Divergence and Bayesian Estimation of Distributions
- Regression using localised functional Bregman divergence
Keywords: asymptotic normality; consistency; prediction error; loss function; local polynomial regression; Bayes optimal rule
MSC classifications: Asymptotic properties of parametric estimators (62F12); Nonparametric regression and quantile regression (62G08); Asymptotic properties of nonparametric inference (62G20); Classification and discrimination; cluster analysis (statistical aspects) (62H30); General nonlinear regression (62J02); Asymptotic distribution theory in statistics (62E20); Bayesian problems; characterization of Bayes procedures (62C10)
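For orientation on the divergence named in the title and keywords: a Bregman divergence is generated by a strictly convex, differentiable function φ via Q(y, μ) = φ(y) − φ(μ) − φ′(μ)(y − μ). The following minimal Python sketch (illustrative only, not code from the paper) shows how the squared-error loss for regression and a deviance-type loss for classification arise as special cases of one generator-based definition:

```python
import math

def bregman(phi, dphi, y, mu):
    """Bregman divergence Q(y, mu) = phi(y) - phi(mu) - dphi(mu) * (y - mu)."""
    return phi(y) - phi(mu) - dphi(mu) * (y - mu)

# Generator phi(t) = t^2 recovers squared error: Q(y, mu) = (y - mu)^2.
sq = bregman(lambda t: t * t, lambda t: 2 * t, 3.0, 1.0)  # -> 4.0

# Generator phi(t) = t*log(t) + (1-t)*log(1-t) (negative Bernoulli entropy)
# yields the binomial deviance-type loss used for classification.
def neg_entropy(t):
    return t * math.log(t) + (1 - t) * math.log(1 - t)

def d_neg_entropy(t):
    return math.log(t / (1 - t))  # logit derivative

dev = bregman(neg_entropy, d_neg_entropy, 0.9, 0.5)  # positive, zero iff y == mu
```

Convexity of the generator guarantees Q(y, μ) ≥ 0 with equality only at y = μ, which is why the conditional mean is the Bayes-optimal predictor under any Bregman loss.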
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- The elements of statistical learning. Data mining, inference, and prediction
- Title not available
- Title not available
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Arcing classifiers. (With discussion)
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Title not available
- The boosting approach to machine learning: an overview
- The Estimation of Prediction Error
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- Local Likelihood Estimation
- On the Optimality of Conditional Expectation as a Bregman Predictor
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Relative loss bounds for on-line density estimation with the exponential family of distributions
Cited In (14)
- Regression with stagewise minimization on risk function
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Kernel density estimation by stagewise algorithm with a simple dictionary
- Locally robust methods and near-parametric asymptotics
- Title not available
- Joint estimation and variable selection for mean and dispersion in proper dispersion models
- Screening-based Bregman divergence estimation with NP-dimensionality
- Robust estimation in regression and classification methods for large dimensional data
- Regression using localised functional Bregman divergence
- Kernel density estimation by genetic algorithm
- Prediction Error Estimation Under Bregman Divergence for Non‐Parametric Regression and Classification
- Penalized Bregman divergence for large-dimensional regression and classification
- Hyperlink regression via Bregman divergence
- Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data