Robustness of reweighted least squares kernel based regression
From MaRDI portal
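The title refers to iteratively reweighting a least squares kernel-based regression fit to gain robustness against outliers. As a rough illustration only, not the authors' exact estimator, the sketch below refits kernel ridge regression with Huber-type weights computed from the residuals; the RBF kernel, the tuning constant c = 1.345, the MAD scale estimate, and all parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def reweighted_krr(X, y, lam=0.01, gamma=1.0, c=1.345, n_iter=10):
    """Kernel ridge regression, iteratively refit with Huber-type
    weights that downweight observations with large residuals."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    w = np.ones(n)  # start from the unweighted fit
    for _ in range(n_iter):
        W = np.diag(w)
        # weighted regularized system: (W K + n*lam*I) alpha = W y
        alpha = np.linalg.solve(W @ K + n * lam * np.eye(n), W @ y)
        r = y - K @ alpha  # residuals of the current fit
        # robust scale via the median absolute deviation (MAD)
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        u = np.abs(r) / s
        # Huber weights: 1 for small residuals, c/u for large ones
        w = np.minimum(1.0, c / np.maximum(u, 1e-12))
    return alpha, w

# toy data with one gross outlier
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)
y[0] = 10.0  # contaminated observation
alpha, w = reweighted_krr(X, y)
pred = rbf_kernel(X, X) @ alpha
```

After a few iterations the contaminated point receives a small weight, so the fitted curve tracks the clean data instead of being pulled toward the outlier, which is the qualitative robustness effect the paper studies.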
Recommendations
- Consistency and robustness of kernel-based regression in convex risk minimization
- Weighted least squares support vector machines: robustness and sparse approximation
- An RKHS approach to robust functional linear regression
- Robust regression function estimation
- Bouligand derivatives and robustness of support vector machines for regression
Cites work
- Scientific article, zbMATH DE number 5968941 (no title available)
- Scientific article, zbMATH DE number 1804115 (no title available)
- Scientific article, zbMATH DE number 3954047 (no title available)
- Scientific article, zbMATH DE number 45848 (no title available)
- Scientific article, zbMATH DE number 3551792 (no title available)
- Scientific article, zbMATH DE number 823069 (no title available)
- DOI: 10.1162/153244302760200704
- DOI: 10.1162/1532443041827925
- Bayesian Framework for Least-Squares Support Vector Machine Classifiers, Gaussian Processes, and Kernel Fisher Discriminant Analysis
- Consistency and robustness of kernel-based regression in convex risk minimization
- Influence Functions of Iteratively Reweighted Least Squares Estimators
- Multivariate adaptive regression splines
- On One-Step GM Estimates and Stability of Inferences in Linear Regression
- On robustness properties of convex risk minimization methods for pattern recognition
- Period Analysis of Variable Stars by Robust Smoothing
- Regularization networks and support vector machines
- Robust Statistics
- Some properties of regularized kernel methods
- Support Vector Machines
- Weighted least squares support vector machines: robustness and sparse approximation
Cited in (15)
- Distributed robust regression with correntropy losses and regularization kernel networks
- Asymmetric least squares support vector machine classifiers
- An exponential-type kernel robust regression model for interval-valued variables
- Robust kernel-based distribution regression
- Gradient descent for robust kernel-based regression
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- Consistency and robustness of kernel-based regression in convex risk minimization
- Robust weighted kernel logistic regression in imbalanced and rare events data
- Primal and dual model representations in kernel-based learning
- Fast rates of minimum error entropy with heavy-tailed noise
- Optimality of robust online learning
- Robustness by reweighting for kernel estimators: an overview
- Robust support vector machines for classification with nonconvex and smooth losses
- Trading Variance Reduction with Unbiasedness: The Regularized Subspace Information Criterion for Robust Model Selection in Kernel Regression
- Kernel-based maximum correntropy criterion with gradient descent method
This page was built for publication: Robustness of reweighted least squares kernel based regression
MaRDI item: Q1049548