Pages that link to "Item:Q1848928"
The following pages link to Empirical margin distributions and bounding the generalization error of combined classifiers (Q1848928):
Displaying 50 items.
- Generalization bounds for metric and similarity learning (Q255367) (← links)
- Consistency and generalization bounds for maximum entropy density estimation (Q280753) (← links)
- Robustness and generalization (Q420915) (← links)
- Generalization error bounds for the logical analysis of data (Q427877) (← links)
- Further results on the margin explanation of boosting: new algorithm and experiments (Q439827) (← links)
- Boosting algorithms: regularization, prediction and model fitting (Q449780) (← links)
- Structural multiple empirical kernel learning (Q528736) (← links)
- The value of agreement, a new boosting algorithm (Q927875) (← links)
- New multicategory boosting algorithms based on multicategory Fisher-consistent losses (Q999662) (← links)
- Vote counting measures for ensemble classifiers. (Q1425964) (← links)
- Concentration inequalities using the entropy method (Q1431503) (← links)
- Relative deviation learning bounds and generalization with unbounded loss functions (Q1714946) (← links)
- Bootstrap -- an exploration (Q1731214) (← links)
- Robust multicategory support vector machines using difference convex algorithm (Q1749454) (← links)
- A note on margin-based loss functions in classification (Q1770065) (← links)
- Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins. (Q1872344) (← links)
- Complexity regularization via localized random penalties (Q1879970) (← links)
- On the Bayes-risk consistency of regularized boosting methods. (Q1884602) (← links)
- Optimal aggregation of classifiers in statistical learning. (Q1884608) (← links)
- Transfer bounds for linear feature learning (Q1959489) (← links)
- Fast generalization error bound of deep learning without scale invariance of activation functions (Q2055056) (← links)
- Interpretable machine learning: fundamental principles and 10 grand challenges (Q2074414) (← links)
- A statistical learning perspective on switched linear system identification (Q2081825) (← links)
- AdaBoost and robust one-bit compressed sensing (Q2102435) (← links)
- Influence diagnostics in support vector machines (Q2131934) (← links)
- Noisy tensor completion via the sum-of-squares hierarchy (Q2144539) (← links)
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers (Q2148995) (← links)
- MREKLM: a fast multiple empirical kernel learning machine (Q2289596) (← links)
- Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions (Q2313281) (← links)
- Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions (Q2353006) (← links)
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder) (Q2373576) (← links)
- Complexity of pattern classes and the Lipschitz property (Q2381580) (← links)
- \(L_{p}\)-norm Sauer-Shelah lemma for margin multi-category classifiers (Q2402375) (← links)
- Statistical performance of support vector machines (Q2426613) (← links)
- Ranking and empirical minimization of \(U\)-statistics (Q2426626) (← links)
- Concentration inequalities and asymptotic results for ratio type empirical processes (Q2497173) (← links)
- Square root penalty: Adaptation to the margin in classification and in edge estimation (Q2569239) (← links)
- Complexities of convex combinations and bounding the generalization error in classification (Q2583410) (← links)
- Boosting with early stopping: convergence and consistency (Q2583412) (← links)
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization (Q2642804) (← links)
- A Vector-Contraction Inequality for Rademacher Complexities (Q2830263) (← links)
- Learning with Rejection (Q2830268) (← links)
- Structural Online Learning (Q2830279) (← links)
- Learning with Deep Cascades (Q2835634) (← links)
- Analysis of the generalization ability of a full decision tree (Q2940504) (← links)
- Theory of Classification: a Survey of Some Recent Advances (Q3373749) (← links)
- (Q4558184) (← links)
- Boosting in the Presence of Outliers: Adaptive Classification With Nonconvex Loss Functions (Q4962433) (← links)
- Structure from Randomness in Halfspace Learning with the Zero-One Loss (Q5139592) (← links)
- Guaranteed Classification via Regularized Similarity Learning (Q5378332) (← links)