Consistency of Support Vector Machines and Other Regularized Kernel Classifiers
From MaRDI portal
Publication:3547665
Cited in (74):
- Bayesian additive machine: classification with a semiparametric discriminant function
- Nonparametric augmented probability weighting with sparsity
- Optimal estimation for large-eddy simulation of turbulence and application to the analysis of subgrid models
- Does modeling lead to more accurate classification? A study of relative efficiency in linear classification
- Generalization performance of Lagrangian support vector machine based on Markov sampling
- Learning with mitigating random consistency from the accuracy measure
- Scientific article; zbMATH DE number 7306882 (no title available)
- Robust learning from bites for data mining
- Universal consistency of localized versions of regularized kernel methods
- Algorithmic Learning Theory
- Robustness and generalization
- Estimating individualized treatment rules using outcome weighted learning
- Efficiency of classification methods based on empirical risk minimization
- A signal theory approach to support vector classification: the sinc kernel
- Analysis of support vector machine classification
- On the consistency of the bootstrap approach for support vector machines and related kernel-based methods
- Comment
- A consistency result for functional SVM by spline interpolation
- Demonstrating the stability of support vector machines for classification
- Consistency and convergence rates of one-class SVMs and related algorithms
- Kernel variable selection for multicategory support vector machines
- Oracle properties of SCAD-penalized support vector machine
- Consistency and robustness of kernel-based regression in convex risk minimization
- Proximal activation of smooth functions in splitting algorithms for convex image recovery
- Statistical analysis of kernel-based least-squares density-ratio estimation
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- On the rate of convergence for multi-category classification based on convex losses
- Learning performance of uncentered kernel-based principal component analysis
- The consistency of multicategory support vector machines
- Variational analysis of constrained M-estimators
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Classification with support vector machines and Kolmogorov-Smirnov bounds
- A consistent information criterion for support vector machines in diverging model spaces
- On the influence of the kernel on the consistency of support vector machines
- When can support vector machine achieve fast rates of convergence?
- Fast rates for support vector machines using Gaussian kernels
- Simulation-based classification: a model-order-reduction approach for structural health monitoring
- A Bahadur representation of the linear support vector machine
- Structure learning via unstructured kernel-based M-estimation
- A Fisher consistent multiclass loss function with variable margin on positive examples
- A two-experiment approach to Wiener system identification
- Learning sparse conditional distribution: an efficient kernel-based approach
- Consistency of support vector machines using additive kernels for additive models
- Support vector machines are universally consistent
- On qualitative robustness of support vector machines
- Posterior consistency of semi-supervised regression on graphs
- Asymptotic normality of support vector machine variants and other regularized kernel methods
- Robustness and regularization of support vector machines
- Support vector machine in big data: smoothing strategy and adaptive distributed inference
- Advances in large-margin classifiers
- Generalization performance of least-square regularized regression algorithm with Markov chain samples
- Distribution-free consistency of empirical risk minimization and support vector regression
- Robustness of learning algorithms using hinge loss with outlier indicators
- Coefficient-based regularization network with variance loss for error
- Robust support vector machines for classification with nonconvex and smooth losses
- Comment
- Calibrated asymmetric surrogate losses
- Classification in general finite dimensional spaces with the \(k\)-nearest neighbor rule
- Statistical performance of support vector machines
- On the consistency of multi-label learning
- Some properties of regularized kernel methods
- Quantitative convergence analysis of kernel based large-margin unified machines
- Function Classes That Approximate the Bayes Risk
- Sparse additive support vector machines in bounded variation space
- Linear twin quadratic surface support vector regression
- Error analysis of classification learning algorithms based on LUMs loss
- The new interpretation of support vector machines on statistical learning theory
- Divide-and-conquer for debiased \(l_1\)-norm support vector machine in ultra-high dimensions
- A new analytical approach to consistency and overfitting in regularized empirical risk minimization
- Learning from dependent observations
- On surrogate loss functions and \(f\)-divergences
- Relaxing support vectors for classification
- Theory of Classification: a Survey of Some Recent Advances
- A note on stability of error bounds in statistical learning theory
This page was built for publication: Consistency of Support Vector Machines and Other Regularized Kernel Classifiers