Two smooth support vector machines for \(\varepsilon \)-insensitive regression
From MaRDI portal
Publication:1753072
DOI: 10.1007/s10589-017-9975-9 | zbMath: 1418.90255 | OpenAlex: W2777871713 | MaRDI QID: Q1753072
Chun-Hsu Ko, Yuh-Jye Lee, Wei-Zhe Gu, Wei-Po Chen, Jein-Shan Chen
Publication date: 25 May 2018
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-017-9975-9
support vector machine · smoothing Newton algorithm · \(\varepsilon \)-insensitive loss · \(\varepsilon \)-smooth support vector regression
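The keywords above refer to smoothing the nondifferentiable \(\varepsilon \)-insensitive loss so that Newton-type methods apply. As a minimal sketch (not the authors' exact formulation), the loss \(\max(|r|-\varepsilon, 0)\) can be written as a sum of two plus functions, each replaced by the standard smooth p-function \(p(x,\alpha) = x + \frac{1}{\alpha}\log(1+e^{-\alpha x})\) used in smooth SVM literature; the function names and the parameter `alpha` below are illustrative:

```python
import numpy as np

def eps_insensitive(r, eps=0.1):
    """Exact epsilon-insensitive loss: zero inside the tube |r| <= eps."""
    return np.maximum(np.abs(r) - eps, 0.0)

def smooth_plus(x, alpha=5.0):
    """Smooth upper approximation of max(x, 0) (the p-function);
    converges to the plus function as alpha -> infinity."""
    return x + np.log(1.0 + np.exp(-alpha * x)) / alpha

def smooth_eps_insensitive(r, eps=0.1, alpha=5.0):
    """Smooth surrogate obtained by smoothing each plus function in
    max(|r|-eps, 0) = max(r-eps, 0) + max(-r-eps, 0)."""
    return smooth_plus(r - eps, alpha) + smooth_plus(-r - eps, alpha)

r = np.linspace(-1.0, 1.0, 5)
print(eps_insensitive(r, eps=0.2))
print(smooth_eps_insensitive(r, eps=0.2, alpha=20.0))
```

The surrogate is everywhere differentiable and bounds the exact loss from above, tightening as `alpha` grows, which is what makes a smoothing Newton algorithm applicable.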
Related Items (1)
Uses Software
Cites Work
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- A smoothing-type algorithm for solving system of inequalities
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Multivariate adaptive regression splines
- Critical point theory and submanifold geometry
- A nonsmooth version of Newton's method
- Optimization and nonsmooth analysis
- Semismooth and Semiconvex Functions in Constrained Optimization
- SSVM: A smooth support vector machine for classification
- Benchmarking optimization software with performance profiles.