Pages that link to "Item:Q1807156"
From MaRDI portal
The following pages link to Boosting the margin: a new explanation for the effectiveness of voting methods (Q1807156):
Displaying 50 items.
- Deformation of log-likelihood loss function for multiclass boosting (Q1784701)
- BoostWofE: a new sequential weights of evidence model reducing the effect of conditional dependency (Q1789094)
- A novel margin-based measure for directed hill climbing ensemble pruning (Q1793078)
- Arcing classifiers. (With discussion) (Q1807115)
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors) (Q1848780)
- On weak base hypotheses and their implications for boosting regression and classification (Q1848929)
- Iterative Bayes (Q1870536)
- Top-down decision tree learning as information based boosting (Q1870539)
- Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins. (Q1872344)
- Generalization error of combined classifiers. (Q1872713)
- Generalization bounds for averaged classifiers (Q1879971)
- On approximating weighted sums with exponentially many terms (Q1880781)
- Population theory for boosting ensembles. (Q1884600)
- Process consistency for AdaBoost. (Q1884601)
- On the Bayes-risk consistency of regularized boosting methods. (Q1884602)
- Statistical behavior and consistency of classification methods based on convex risk minimization. (Q1884603)
- Optimal aggregation of classifiers in statistical learning. (Q1884608)
- Boosting random subspace method (Q1932108)
- Anytime classification for a pool of instances (Q1959521)
- On the equivalence of weak learnability and linear separability: new relaxations and efficient boosting algorithms (Q1959594)
- An analysis on the relationship between uncertainty and misclassification rate of classifiers (Q2023206)
- Interpretable machine learning: fundamental principles and 10 grand challenges (Q2074414)
- AdaBoost and robust one-bit compressed sensing (Q2102435)
- On the perceptron's compression (Q2106618)
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers (Q2148995)
- Multi-label optimal margin distribution machine (Q2183598)
- Propositionalization and embeddings: two sides of the same coin (Q2203327)
- Discussion of: ``Nonparametric regression using deep neural networks with ReLU activation function'' (Q2215716)
- Regularization method for predicting an ordinal response using longitudinal high-dimensional genomic data (Q2258452)
- GA-Ensemble: a genetic algorithm for robust ensembles (Q2259225)
- Supervised projection approach for boosting classifiers (Q2270792)
- Feature selection based on loss-margin of nearest neighbor classification (Q2270819)
- Computer science and decision theory (Q2271874)
- Multi-scale rois selection for classifying multi-spectral images (Q2308553)
- Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions (Q2353006)
- Local discriminative distance metrics ensemble learning (Q2353755)
- Learning linear PCA with convex semi-definite programming (Q2373456)
- Data-driven decomposition for multi-class classification (Q2384949)
- Quadratic boosting (Q2384982)
- Recursive aggregation of estimators by the mirror descent algorithm with averaging (Q2432961)
- Empirical risk minimization is optimal for the convex aggregation problem (Q2435238)
- Simultaneous adaptation to the margin and to complexity in classification (Q2456017)
- Optimal third root asymptotic bounds in the statistical estimation of thresholds (Q2466687)
- Optimal rates of aggregation in classification under low noise assumption (Q2469663)
- An empirical study of using Rotation Forest to improve regressors (Q2470171)
- Analysis of boosting algorithms using the smooth margin function (Q2473080)
- From dynamic classifier selection to dynamic ensemble selection (Q2476581)
- Maximum patterns in datasets (Q2478429)
- An efficient modified boosting method for solving classification problems (Q2479397)
- An algorithmic theory of learning: Robust concepts and random projection (Q2499543)