Consistency of Support Vector Machines and Other Regularized Kernel Classifiers
From MaRDI portal
Publication:3547665
DOI: 10.1109/TIT.2004.839514
zbMATH Open: 1304.62090
Wikidata: Q59196431 (Scholia: Q59196431)
MaRDI QID: Q3547665
FDO: Q3547665
Author: Ingo Steinwart
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Mathematics Subject Classification: Classification and discrimination; cluster analysis (statistical aspects) (62H30); Learning and adaptive systems in artificial intelligence (68T05)
Cited In (74)
- Nonparametric augmented probability weighting with sparsity
- Comment
- Learning performance of uncentered kernel-based principal component analysis
- Structure learning via unstructured kernel-based M-estimation
- Posterior consistency of semi-supervised regression on graphs
- Support vector machine in big data: smoothing strategy and adaptive distributed inference
- Comment
- Sparse additive support vector machines in bounded variation space
- Linear twin quadratic surface support vector regression
- Error analysis of classification learning algorithms based on LUMs loss
- Bayesian additive machine: classification with a semiparametric discriminant function
- Optimal estimation for large-eddy simulation of turbulence and application to the analysis of subgrid models
- Title not available
- Learning with mitigating random consistency from the accuracy measure
- Generalization performance of Lagrangian support vector machine based on Markov sampling
- Robust learning from bites for data mining
- Does modeling lead to more accurate classification? A study of relative efficiency in linear classification
- Universal consistency of localized versions of regularized kernel methods
- Algorithmic Learning Theory
- Estimating individualized treatment rules using outcome weighted learning
- Analysis of support vector machine classification
- Robustness and generalization
- On the consistency of the bootstrap approach for support vector machines and related kernel-based methods
- Efficiency of classification methods based on empirical risk minimization
- A signal theory approach to support vector classification: the sinc kernel
- A consistency result for functional SVM by spline interpolation
- Demonstrating the stability of support vector machines for classification
- Consistency and convergence rates of one-class SVMs and related algorithms
- Kernel variable selection for multicategory support vector machines
- Oracle properties of SCAD-penalized support vector machine
- Consistency and robustness of kernel-based regression in convex risk minimization
- Proximal activation of smooth functions in splitting algorithms for convex image recovery
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Statistical analysis of kernel-based least-squares density-ratio estimation
- On the rate of convergence for multi-category classification based on convex losses
- The consistency of multicategory support vector machines
- Variational analysis of constrained M-estimators
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Classification with support vector machines and Kolmogorov-Smirnov bounds
- A consistent information criterion for support vector machines in diverging model spaces
- On the influence of the kernel on the consistency of support vector machines
- When can support vector machine achieve fast rates of convergence?
- Fast rates for support vector machines using Gaussian kernels
- A Bahadur representation of the linear support vector machine
- Simulation-based classification; a model-order-reduction approach for structural health monitoring
- A Fisher consistent multiclass loss function with variable margin on positive examples
- A two-experiment approach to Wiener system identification
- Learning sparse conditional distribution: an efficient kernel-based approach
- Consistency of support vector machines using additive kernels for additive models
- Support vector machines are universally consistent
- On qualitative robustness of support vector machines
- Robustness and regularization of support vector machines
- Asymptotic normality of support vector machine variants and other regularized kernel methods
- Advances in large-margin classifiers
- Generalization performance of least-square regularized regression algorithm with Markov chain samples
- Distribution-free consistency of empirical risk minimization and support vector regression
- Coefficient-based regularization network with variance loss for error
- Robustness of learning algorithms using hinge loss with outlier indicators
- Robust support vector machines for classification with nonconvex and smooth losses
- Calibrated asymmetric surrogate losses
- Statistical performance of support vector machines
- On the consistency of multi-label learning
- Function Classes That Approximate the Bayes Risk
- Some properties of regularized kernel methods
- Classification in general finite dimensional spaces with the \(k\)-nearest neighbor rule
- Quantitative convergence analysis of kernel based large-margin unified machines
- The new interpretation of support vector machines on statistical learning theory
- Divide-and-conquer for debiased \(l_1\)-norm support vector machine in ultra-high dimensions
- A new analytical approach to consistency and overfitting in regularized empirical risk minimization
- Learning from dependent observations
- On surrogate loss functions and \(f\)-divergences
- Theory of Classification: a Survey of Some Recent Advances
- A note on stability of error bounds in statistical learning theory
- Relaxing support vectors for classification