A robust regression framework with Laplace kernel-induced loss
From MaRDI portal
Publication:5380865
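The page does not define the titular loss. In the correntropy literature that this publication draws on (cf. "Learning with the maximum correntropy criterion induced losses for regression" below), a kernel-induced loss has the form \(L_\sigma(r) = 1 - \kappa(r)/\kappa(0)\) for a kernel \(\kappa\); for the Laplace kernel this gives \(L_\sigma(r) = 1 - \exp(-|r|/\sigma)\). The sketch below is an illustration under that assumption, not the paper's exact formulation; the function name and bandwidth parameter are hypothetical.

```python
import numpy as np

def laplace_induced_loss(r, sigma=1.0):
    """Assumed Laplace kernel-induced loss: 1 - exp(-|r| / sigma).

    Bounded above by 1, so a single large residual (outlier) contributes
    at most a constant, unlike the unbounded squared loss.
    """
    return 1.0 - np.exp(-np.abs(r) / sigma)

# Compare the penalty an outlier residual incurs under each loss.
residuals = np.array([0.0, 0.5, 100.0])
robust = laplace_induced_loss(residuals)       # stays in [0, 1)
squared = residuals ** 2                       # outlier dominates
```

The boundedness is what makes such losses robust: in the squared loss the residual 100 contributes 10000 to the objective, while under the kernel-induced loss its contribution saturates near 1, so a few gross outliers cannot dominate the fit.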
Recommendations
- Robust support vector regression with generic quadratic nonconvex \(\varepsilon\)-insensitive loss
- Robust support vector regression in the primal
- Training robust support vector regression with smooth non-convex loss function
- Kernel-based sparse regression with the correntropy-induced loss
- A Framework of Learning Through Empirical Gain Maximization
Cites work
- Scientific article (zbMATH DE number 1332320; no title available)
- A DC programming approach for feature selection in support vector machines learning
- A Recursive Least M-Estimate Algorithm for Robust Adaptive Filtering in Impulsive Noise: Fast Algorithm and Convergence Performance Analysis
- Correntropy: Properties and Applications in Non-Gaussian Signal Processing
- DC approximation approaches for sparse optimization
- Estimating conditional quantiles with the help of the pinball loss
- Information theoretic learning. Renyi's entropy and kernel perspectives
- Learning with the maximum correntropy criterion induced losses for regression
- Ramp loss linear programming support vector machine
- Robust Statistics
- Robust support vector machines based on the rescaled hinge loss function
- Robust support vector regression in the primal
- TSVR: an efficient twin support vector machine for regression
- The C-loss function for pattern classification
- Weighted least squares support vector machines: robustness and sparse approximation
Cited in (2)
This page was built for publication: A robust regression framework with Laplace kernel-induced loss