Convergence rate of SVM for kernel-based robust regression
DOI: 10.1142/S0219691319500048 · zbMATH Open: 1408.41015 · OpenAlex: W2896723012 · Wikidata: Q129126408 · MaRDI QID: Q4626547 · FDO: Q4626547
Authors: Shuhua Wang, Bao-Huai Sheng, Zhenlong Chen
Publication date: 28 February 2019
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Full work available at URL: https://doi.org/10.1142/s0219691319500048
Recommendations
- Learning rates of kernel-based robust classification
- Robust support vector regression in the primal
- Robust support vector regression with generic quadratic nonconvex \(\varepsilon\)-insensitive loss
- Training robust support vector regression with smooth non-convex loss function
- Homotopy continuation approaches for robust SV classification and regression
Keywords: robust regression; support vector machine; convergence rate; quasiconvex loss function; right directional derivative
MSC classification: Convex programming (90C25); Computational learning theory (68Q32); Approximation by other special function classes (41A30); Hilbert spaces with reproducing kernels, i.e. (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces (46E22)
Cites Work
- Theory of Reproducing Kernels
- Support Vector Machines
- Robust Truncated Hinge Loss Support Vector Machines
- Title not available
- Robust Statistics
- On the mathematical foundations of learning
- Title not available
- Support vector machine soft margin classifiers: error analysis
- Universal kernels
- Conditional quantiles with varying Gaussians
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Estimating conditional quantiles with the help of the pinball loss
- Multi-kernel regularized classifiers
- Generalized convexity and optimization. Theory and applications
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- Half supervised coefficient regularization for regression learning with unbounded sampling
- Regularized least square regression with unbounded and dependent sampling
- Integral operator approach to learning theory with unbounded sampling
- The Remedian: A Robust Averaging Method for Large Data Sets
- Semi-supervised learning based on high density region estimation
- Convergence rate of the semi-supervised greedy algorithm
- Aspects of robust linear regression
- Consistency of kernel-based quantile regression
- A new comparison theorem on conditional quantiles
- The learning rates of regularized regression based on reproducing kernel Banach spaces
- The consistency of least-square regularized regression with negative association sequence
- Convergence rate of semi-supervised gradient learning algorithms
- The performance of semi-supervised Laplacian regularized regression with the least square loss
- Homotopy continuation approaches for robust SV classification and regression
- Kernel methods for independence measurement with coefficient constraints
- Generalization errors of Laplacian regularized least squares regression
- Learning sets with separating kernels
- The convergence rate of semi-supervised regression with quadratic loss
- Learning rate of magnitude-preserving regularization ranking with dependent samples
- Asymptotic analysis of quantile regression learning based on coefficient dependent regularization
Cited In (8)
- Gradient descent for robust kernel-based regression
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- On the \(K\)-functional in learning theory
- Learning rates of kernel-based robust classification
- Convergence of online pairwise regression learning with quadratic loss
- Robust support vector regression in the primal
- Coefficient-based regularization network with variance loss for error
- Homotopy continuation approaches for robust SV classification and regression