Convergence rate of SVM for kernel-based robust regression
DOI: 10.1142/S0219691319500048 · zbMath: 1408.41015 · OpenAlex: W2896723012 · Wikidata: Q129126408 · Scholia: Q129126408 · MaRDI QID: Q4626547
Shuhua Wang, Bao Huai Sheng, Zhen Long Chen
Publication date: 28 February 2019
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Full work available at URL: https://doi.org/10.1142/s0219691319500048
convergence rate; robust regression; support vector machine; quasiconvex loss function; right directional derivative
Computational learning theory (68Q32); Convex programming (90C25); Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22); Approximation by other special function classes (41A30)
Related Items (5)
Cites Work
- Regularized least square regression with unbounded and dependent sampling
- Integral operator approach to learning theory with unbounded sampling
- Convergence rate of the semi-supervised greedy algorithm
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Estimating conditional quantiles with the help of the pinball loss
- A new comparison theorem on conditional quantiles
- Multi-kernel regularized classifiers
- Generalized convexity and optimization. Theory and applications
- Aspects of robust linear regression
- Homotopy continuation approaches for robust SV classification and regression
- Semi-supervised learning based on high density region estimation
- Generalization errors of Laplacian regularized least squares regression
- Conditional quantiles with varying Gaussians
- Learning sets with separating kernels
- The learning rates of regularized regression based on reproducing kernel Banach spaces
- The convergence rate of semi-supervised regression with quadratic loss
- On the mathematical foundations of learning
- Learning rate of magnitude-preserving regularization ranking with dependent samples
- Half supervised coefficient regularization for regression learning with unbounded sampling
- Kernel methods for independence measurement with coefficient constraints
- The performance of semi-supervised Laplacian regularized regression with the least square loss
- The Remedian: A Robust Averaging Method for Large Data Sets
- Support Vector Machines
- Consistency of kernel-based quantile regression
- Robust Truncated Hinge Loss Support Vector Machines
- Estimating the approximation error in learning theory
- The consistency of least-square regularized regression with negative association sequence
- Theory of Reproducing Kernels
- Robust Statistics