Statistical behavior and consistency of classification methods based on convex risk minimization.



DOI: 10.1214/aos/1079120130

zbMath: 1105.62323

MaRDI QID: Q1884603

Tong Zhang

Publication date: 5 November 2004

Published in: The Annals of Statistics

Full work available at URL: https://projecteuclid.org/euclid.aos/1079120130


MSC classification

62H30: Classification and discrimination; cluster analysis (statistical aspects)

62F15: Bayesian inference


Related Items

SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming

Selection of Binary Variables and Classification by Boosting

PAC-Bayesian bounds for randomized empirical risk minimizers

Multi-kernel regularized classifiers

Parzen windows for multi-class classification

Learning rates for regularized classifiers using multivariate polynomial kernels

Learning from dependent observations

Fast rates for support vector machines using Gaussian kernels

New multicategory boosting algorithms based on multicategory Fisher-consistent losses

Robust learning from bites for data mining

On surrogate loss functions and \(f\)-divergences

Convergence rates of generalization errors for margin-based classification

A note on margin-based loss functions in classification

On the Bayes-risk consistency of regularized boosting methods.

Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)

Fully online classification by regularization

Statistical performance of support vector machines

Ranking and empirical minimization of \(U\)-statistics

Recursive aggregation of estimators by the mirror descent algorithm with averaging

Convergence analysis of online algorithms

Simultaneous adaptation to the margin and to complexity in classification

Optimal rates of aggregation in classification under low noise assumption

On the rate of convergence for multi-category classification based on convex losses

Approximation with polynomial kernels and SVM classifiers

Learning rates of gradient descent algorithm for classification

Complexities of convex combinations and bounding the generalization error in classification

Boosting with early stopping: convergence and consistency

Classifiers of support vector machine type with \(\ell_1\) complexity regularization

Theory of Classification: a Survey of Some Recent Advances

SVM LEARNING AND Lp APPROXIMATION BY GAUSSIANS ON RIEMANNIAN MANIFOLDS

Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability

Generalization Bounds for Some Ordinal Regression Algorithms

Nonparametric Modeling of Neural Point Processes via Stochastic Gradient Boosting Regression

