Consistency and robustness of kernel-based regression in convex risk minimization
From MaRDI portal
Publication:2469652
Abstract: We investigate statistical properties of a broad class of modern kernel-based regression (KBR) methods. These kernel methods were developed during the last decade and are inspired by convex risk minimization in infinite-dimensional Hilbert spaces; one leading example is support vector regression. We first describe the relationship between the loss function of the KBR method and the tail behaviour of the response variable. We then establish L-risk consistency for KBR, which gives the mathematical justification for the statement that these methods are able to "learn". Next we consider robustness properties of such kernel methods. In particular, our results allow us to choose the loss function and the kernel so as to obtain computationally tractable and consistent KBR methods that have bounded influence functions. Furthermore, bounds for the bias and for the sensitivity curve, a finite-sample version of the influence function, are developed, and the relationship between KBR and classical estimators is discussed.
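The abstract's two central notions, regularized convex risk minimization over a reproducing kernel Hilbert space and the sensitivity curve as a finite-sample influence function, can be illustrated with a minimal numerical sketch. The example below uses kernel ridge regression (the least-squares loss, one special case of the convex losses the paper considers) with a Gaussian kernel; all function names and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def gaussian_kernel(X, Y, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


def krr_fit(X, y, lam=0.1, gamma=1.0):
    # Kernel ridge regression: minimizes the regularized empirical
    # squared-loss risk over the RKHS -- a special case of the convex
    # risk minimization schemes treated in the paper.
    n = len(X)
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)


def krr_predict(X_train, alpha, X_new, gamma=1.0):
    return gaussian_kernel(X_new, X_train, gamma) @ alpha


def sensitivity_curve(X, y, x_contam, y_contam, x_eval, lam=0.1, gamma=1.0):
    # Finite-sample sensitivity curve: n * (f_contaminated - f_clean)
    # at x_eval, where the contaminated fit adds one point (x_contam,
    # y_contam) to the sample.
    n = len(X)
    f_clean = krr_predict(X, krr_fit(X, y, lam, gamma), x_eval, gamma)
    Xc = np.vstack([X, x_contam])
    yc = np.append(y, y_contam)
    f_contam = krr_predict(Xc, krr_fit(Xc, yc, lam, gamma), x_eval, gamma)
    return n * (f_contam - f_clean)


rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(40)
sc = sensitivity_curve(X, y, np.array([[0.0]]), 10.0, np.array([[0.0]]))
```

Because the squared loss is unbounded, the fitted function is linear in the responses, so this sensitivity curve grows without bound as the contaminating response value grows; this is exactly the behaviour that the paper's choice of loss and kernel with bounded influence functions is designed to avoid.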
Recommendations
- On robustness properties of convex risk minimization methods for pattern recognition
- Robustness of reweighted least squares kernel based regression
- Bouligand derivatives and robustness of support vector machines for regression
- On the robustness of regularized pairwise learning methods based on kernels
- Robust regression function estimation
Cites work
- scientific article; zbMATH DE number 1637270
- scientific article; zbMATH DE number 3954047
- scientific article; zbMATH DE number 45848
- scientific article; zbMATH DE number 3577802
- scientific article; zbMATH DE number 3576139
- scientific article; zbMATH DE number 3637090
- scientific article; zbMATH DE number 1332320
- scientific article; zbMATH DE number 1950576
- scientific article; zbMATH DE number 1476625
- scientific article; zbMATH DE number 1843268
- scientific article; zbMATH DE number 803215
- 10.1162/1532443041827925
- A distribution-free theory of nonparametric regression
- An approach to model complex high-dimensional insurance data
- Aspects of robust linear regression
- Consistency of Support Vector Machines and Other Regularized Kernel Classifiers
- Consistency of kernel-based quantile regression
- Convex functions, monotone operators and differentiability.
- Function Classes That Approximate the Bayes Risk
- How to compare different loss functions and their risks
- Least Median of Squares Regression
- On robustness properties of convex risk minimization methods for pattern recognition
- On the influence of the kernel on the consistency of support vector machines
- Real Analysis and Probability
- Robust Statistics
- Some properties of regularized kernel methods
- Sums of independent Banach space valued random variables
- The Influence Curve and Its Role in Robust Estimation
Cited in (46)
- Robust kernel-based distribution regression
- An Invariance Property of Predictors in Kernel-Induced Hypothesis Spaces
- The performance of semi-supervised Laplacian regularized regression with the least square loss
- Bouligand derivatives and robustness of support vector machines for regression
- scientific article; zbMATH DE number 7370640
- Prediction of dynamical time series using kernel based regression and smooth splines
- Feasible generalized least squares using support vector regression
- Distribution-free consistency of empirical risk minimization and support vector regression
- Gradient descent for robust kernel-based regression
- Estimation of the bandwidth parameter in Nadaraya-Watson kernel non-parametric regression based on universal threshold level
- Robust learning from bites for data mining
- Asymptotic normality of support vector machine variants and other regularized kernel methods
- Testing subspace restrictions in the presence of high dimensional nuisance parameters
- Detecting influential observations in kernel PCA
- A statistical learning assessment of Huber regression
- Optimality of robust online learning
- Identifying outliers using multiple kernel canonical correlation analysis with application to imaging genetics
- Error analysis on Hermite learning with gradient data
- On a strategy to develop robust and simple tariffs from motor vehicle insurance data
- On qualitative robustness of support vector machines
- Deep learning theory of distribution regression with CNNs
- Robust pairwise learning with Huber loss
- Consistency of support vector machines for forecasting the evolution of an unknown ergodic dynamical system from observations with unknown noise
- Analysis of support vector machines regression
- Modification of the adaptive Nadaraya-Watson kernel method for nonparametric regression (simulation study)
- A two-experiment approach to Wiener system identification
- Loan pricing under estimation risk
- Learning from dependent observations
- Error analysis of the kernel regularized regression based on refined convex losses and RKBSs
- Kernel-Based Partial Permutation Test for Detecting Heterogeneous Functional Relationship
- Adaptive kernel methods using the balancing principle
- Privacy-preserving parametric inference: a case for robust statistics
- On the robustness of regularized pairwise learning methods based on kernels
- On robustness properties of convex risk minimization methods for pattern recognition
- Distributed robust regression with correntropy losses and regularization kernel networks
- Robustness by reweighting for kernel estimators: an overview
- Convergence analysis for kernel-regularized online regression associated with an RRKHS
- A review on consistency and robustness properties of support vector machines for heavy-tailed distributions
- Robustness of reweighted least squares kernel based regression
- Robust nonparametric kernel regression estimator
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- Frameworks and results in distributionally robust optimization
- Fast rates of minimum error entropy with heavy-tailed noise
- Learning with convex loss and indefinite kernels
- Trading Variance Reduction with Unbiasedness: The Regularized Subspace Information Criterion for Robust Model Selection in Kernel Regression
- Performance analysis of the LapRSSLG algorithm in learning theory
This page was built for publication: Consistency and robustness of kernel-based regression in convex risk minimization