The following pages link to 10.1162/1532443041424319 (Q4823525):
Displaying 43 items.
- Boosting algorithms: regularization, prediction and model fitting (Q449780)
- Margin-adaptive model selection in statistical learning (Q453298)
- Learning rate of support vector machine for ranking (Q468458)
- Optimal exponential bounds on the accuracy of classification (Q485316)
- Cox process functional learning (Q500875)
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies (Q641157)
- Multi-kernel regularized classifiers (Q870343)
- Boosting and instability for regression trees (Q959181)
- Fast learning rates for plug-in classifiers (Q995418)
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses (Q999662)
- Deformation of log-likelihood loss function for multiclass boosting (Q1784701)
- Calibrated asymmetric surrogate losses (Q1950846)
- Classification with minimax fast rates for classes of Bayes rules with sparse representation (Q1951772)
- Penalized empirical risk minimization over Besov spaces (Q1952004)
- Surrogate losses in passive and active learning (Q2008623)
- Boosted nonparametric hazards with time-dependent covariates (Q2054480)
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers (Q2148995)
- Convergence rates for empirical barycenters in metric spaces: curvature, convexity and extendable geodesics (Q2182123)
- Moving quantile regression (Q2301045)
- Learning performance of regularized moving least square regression (Q2359988)
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder) (Q2373576)
- Learning with sample dependent hypothesis spaces (Q2389476)
- Accelerated gradient boosting (Q2425242)
- Statistical performance of support vector machines (Q2426613)
- Ranking and empirical minimization of \(U\)-statistics (Q2426626)
- Fast learning from \(\alpha\)-mixing observations (Q2443266)
- Convergence analysis of online algorithms (Q2454719)
- Simultaneous adaptation to the margin and to complexity in classification (Q2456017)
- Optimal rates of aggregation in classification under low noise assumption (Q2469663)
- On the rate of convergence for multi-category classification based on convex losses (Q2475308)
- Square root penalty: adaptation to the margin in classification and in edge estimation (Q2569239)
- Complexities of convex combinations and bounding the generalization error in classification (Q2583410)
- Boosting with early stopping: convergence and consistency (Q2583412)
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization (Q2642804)
- On the optimality of sample-based estimates of the expectation of the empirical minimizer (Q3085585)
- Theory of classification: a survey of some recent advances (Q3373749)
- Fast rates for estimation error and oracle inequalities for model selection (Q3632389)
- Consistency and convergence rate for nearest subspace classifier (Q4603717)
- (Q4998897)
- Learning with Convex Loss and Indefinite Kernels (Q5378314)
- Learning Theory Estimates with Observations from General Stationary Stochastic Processes (Q5380606)
- Optimization by Gradient Boosting (Q5870986)
- Infinitesimal gradient boosting (Q6123287)