On the robustness of regularized pairwise learning methods based on kernels
Publication: 325147
DOI: 10.1016/j.jco.2016.07.001
zbMATH: 1403.62089
arXiv: 1510.03267
OpenAlex: W2963955020
MaRDI QID: Q325147
Andreas Christmann, Ding-Xuan Zhou
Publication date: 17 October 2016
Published in: Journal of Complexity
Full work available at URL: https://arxiv.org/abs/1510.03267
Related Items (8)
- Unnamed Item
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- Convergence of online pairwise regression learning with quadratic loss
- Error analysis of kernel regularized pairwise learning with a strongly convex loss
- Stability and optimization error of stochastic gradient descent for pairwise learning
- Robust pairwise learning with Huber loss
- Online pairwise learning algorithms with convex loss functions
- On extension theorems and their connection to universal consistency in machine learning
Cites Work
- Consistency analysis of an empirical minimum error entropy algorithm
- Consistency of support vector machines using additive kernels for additive models
- On consistency and robustness properties of support vector machines for heavy-tailed distributions
- On qualitative robustness of support vector machines
- On McDiarmid's concentration inequality
- Qualitative robustness in abstract inference
- Convex analysis and measurable multifunctions
- Bootstrap methods: another look at the jackknife
- On robustness properties of bootstrap approximations
- Compactly supported positive definite radial functions
- Piecewise polynomial, positive definite and compactly supported radial functions of minimal degree
- Ranking and empirical minimization of \(U\)-statistics
- Consistency and robustness of kernel-based regression in convex risk minimization
- Learning rates for the risk of kernel-based quantile regression estimators in additive models
- Learning Theory
- Support Vector Machines
- The Influence Curve and Its Role in Robust Estimation
- Real Analysis and Probability
- Information Theoretic Learning
- Regularization schemes for minimum error entropy principle
- Distinguishability of Sets of Distributions
- A General Qualitative Definition of Robustness
- Robust Statistics