Calibration of \(\epsilon\)-insensitive loss in support vector machines regression
From MaRDI portal
Publication: 1730072
DOI: 10.1016/j.jfranklin.2018.11.021
zbMath: 1448.68377
OpenAlex: W2911480375
MaRDI QID: Q1730072
Publication date: 11 March 2019
Published in: Journal of the Franklin Institute
Full work available at URL: https://doi.org/10.1016/j.jfranklin.2018.11.021
Nonparametric regression and quantile regression (62G08)
Learning and adaptive systems in artificial intelligence (68T05)
Related Items (2)
Discrete gradient-zeroing neural network algorithms for handling future quadratic program as well as robot arm via ten-instant formula ⋮ Convergence rates of support vector machines regression for functional data
Cites Work
- Approximation analysis of learning algorithms for support vector regression and quantile regression
- Estimating conditional quantiles with the help of the pinball loss
- Learning with varying insensitive loss
- Oracle inequalities for support vector machines that are based on random entropy numbers
- Multi-kernel regularized classifiers
- Analysis of support vector machines regression
- Conditional quantiles with varying Gaussians
- Regularization networks and support vector machines
- Embedded feature-selection support vector machine for driving pattern recognition
- Local Rademacher complexities
- Breast cancer detection from FNA using SVM with different parameter tuning systems and SOM-RBF classifier
- How to compare different loss functions and their risks
- Support Vector Machines
- Convexity, Classification, and Risk Bounds
- Theory of Reproducing Kernels
- Robust Statistics
This page was built for publication: Calibration of \(\epsilon\)-insensitive loss in support vector machines regression