Pages that link to "Item:Q1848780"
From MaRDI portal
The following pages link to Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors) (Q1848780):
Displaying 50 items.
- Quadratic Majorization for Nonconvex Loss with Applications to the Boosting Algorithm (Q82722) (← links)
- Robust boosting with truncated loss functions (Q82723) (← links)
- Model-based boosting in R: a hands-on tutorial using the R package mboost (Q110461) (← links)
- Nearly unbiased variable selection under minimax concave penalty (Q117379) (← links)
- Greedy function approximation: A gradient boosting machine. (Q127532) (← links)
- Tree-structured modelling of categorical predictors in generalized additive regression (Q137407) (← links)
- A new hybrid classification algorithm for customer churn prediction based on logistic regression and decision trees (Q138141) (← links)
- Ensemble classification based on generalized additive models (Q151094) (← links)
- A conversation with Jerry Friedman (Q254457) (← links)
- Learning ELM-tree from big data based on uncertainty reduction (Q277424) (← links)
- Finding causative genes from high-dimensional data: an appraisal of statistical and machine learning approaches (Q309421) (← links)
- An empirical comparison of classification algorithms for mortgage default prediction: evidence from a distressed mortgage market (Q320966) (← links)
- Cost-sensitive boosting algorithms: do we really need them? (Q331693) (← links)
- Mean and quantile boosting for partially linear additive models (Q340847) (← links)
- Improved nearest neighbor classifiers by weighting and selection of predictors (Q340856) (← links)
- On hybrid classification using model assisted posterior estimates (Q408060) (← links)
- Blasso for object categorization and retrieval: towards interpretable visual models (Q408074) (← links)
- A simple extension of boosting for asymmetric mislabeled data (Q419240) (← links)
- Further results on the margin explanation of boosting: new algorithm and experiments (Q439827) (← links)
- Kullback-Leibler aggregation and misspecified generalized linear models (Q447818) (← links)
- Boosting algorithms: regularization, prediction and model fitting (Q449780) (← links)
- Comment on: Boosting algorithms: regularization, prediction and model fitting (Q449783) (← links)
- Rejoinder: Boosting algorithms: regularization, prediction and model fitting (Q449785) (← links)
- Functional gradient ascent for probit regression (Q454437) (← links)
- A noise-detection based AdaBoost algorithm for mislabeled data (Q454443) (← links)
- Fully corrective boosting with arbitrary loss and regularization (Q460675) (← links)
- On a method for constructing ensembles of regression models (Q462080) (← links)
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms (Q466526) (← links)
- Does modeling lead to more accurate classification? A study of relative efficiency in linear classification (Q476236) (← links)
- A sharp nonasymptotic bound and phase diagram of \(L_{1/2}\) regularization (Q477827) (← links)
- Cox process functional learning (Q500875) (← links)
- Boosting conditional probability estimators (Q513339) (← links)
- Non-crossing large-margin probability estimation and its application to robust SVM via preconditioning (Q537462) (← links)
- Remembering Leo Breiman (Q542912) (← links)
- Remembrance of Leo Breiman (Q542915) (← links)
- Node harvest (Q542973) (← links)
- \(L_{2}\) boosting in kernel regression (Q605014) (← links)
- Small area estimation of the homeless in Los Angeles: an application of cost-sensitive stochastic gradient boosting (Q614137) (← links)
- Survival ensembles by the sum of pairwise differences with application to lung cancer microarray studies (Q641157) (← links)
- A boosting method for maximization of the area under the ROC curve (Q645529) (← links)
- Representing and recognizing objects with massive local image patches (Q645863) (← links)
- A boosting approach for supervised Mahalanobis distance metric learning (Q645912) (← links)
- Entropy and divergence associated with power function and the statistical application (Q653348) (← links)
- Unsupervised weight parameter estimation method for ensemble learning (Q662147) (← links)
- Boosting in Cox regression: a comparison between the likelihood-based and the model-based approaches with focus on the R-packages \textit{CoxBoost} and \textit{mboost} (Q736636) (← links)
- Optimal prediction pools (Q738000) (← links)
- Robust exponential squared loss-based variable selection for high-dimensional single-index varying-coefficient model (Q738981) (← links)
- A weight-adjusted voting algorithm for ensembles of classifiers (Q743769) (← links)
- Iterative bias reduction: a comparative study (Q746347) (← links)
- Soft-max boosting (Q747255) (← links)