New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
asymptotic normality; consistency; prediction error; loss function; local polynomial regression; Bayes optimal rule
Asymptotic properties of parametric estimators (62F12); Nonparametric regression and quantile regression (62G08); Asymptotic properties of nonparametric inference (62G20); Classification and discrimination; cluster analysis (statistical aspects) (62H30); General nonlinear regression (62J02); Asymptotic distribution theory in statistics (62E20); Bayesian problems; characterization of Bayes procedures (62C10)
- Penalized Bregman divergence for large-dimensional regression and classification
- Prediction Error Estimation Under Bregman Divergence for Non-Parametric Regression and Classification
- Statistical inference of minimum BD estimators and classifiers for varying-dimensional models
- Functional Bregman Divergence and Bayesian Estimation of Distributions
- Regression using localised functional Bregman divergence
- scientific article; zbMATH DE number 3458075 (no title available)
- scientific article; zbMATH DE number 472973 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- A decision-theoretic generalization of on-line learning and an application to boosting
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Arcing classifiers. (With discussion)
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Local Likelihood Estimation
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- On the Optimality of Conditional Expectation as a Bregman Predictor
- Relative loss bounds for on-line density estimation with the exponential family of distributions
- The Estimation of Prediction Error
- The boosting approach to machine learning: an overview
- The elements of statistical learning. Data mining, inference, and prediction
- Regression with stagewise minimization on risk function
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Statistical inference of minimum BD estimators and classifiers for varying-dimensional models
- Kernel density estimation by stagewise algorithm with a simple dictionary
- Locally robust methods and near-parametric asymptotics
- Joint estimation and variable selection for mean and dispersion in proper dispersion models
- Screening-based Bregman divergence estimation with NP-dimensionality
- Asymptotic theory for local estimators based on Bregman divergence
- Robust estimation in regression and classification methods for large dimensional data
- Estimation and variable selection on sparse model with group structure
- Regression using localised functional Bregman divergence
- Kernel density estimation by genetic algorithm
- Prediction Error Estimation Under Bregman Divergence for Non-Parametric Regression and Classification
- Penalized Bregman divergence for large-dimensional regression and classification
- Hyperlink regression via Bregman divergence
- Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data
This page was built for publication: New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation (MaRDI item Q3636244)