Leave-One-Out Bounds for Support Vector Regression Model Selection
From MaRDI portal
Publication:4678450
DOI: 10.1162/0899766053491869
zbMath: 1096.68128
Wikidata: Q60241071
Scholia: Q60241071
MaRDI QID: Q4678450
Publication date: 23 May 2005
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/0899766053491869
68T05: Learning and adaptive systems in artificial intelligence
Related Items
- Efficient sparse least squares support vector machines for pattern classification
- On linear programs with linear complementarity constraints
- An instrumental least squares support vector machine for nonlinear system identification
- Efficient optimization of hyper-parameters for least squares support vector regression
- Efficient Computation and Model Selection for the Support Vector Regression
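The publication and its related items concern selecting SVR hyperparameters via leave-one-out estimates. As an illustrative sketch only (not the paper's bounds, which exist precisely to avoid this retraining cost), the brute-force LOO procedure that such bounds approximate can be written with scikit-learn; the toy data, grid values, and `loo_mse` helper below are assumptions for demonstration:

```python
# Illustrative sketch: exhaustive leave-one-out model selection for epsilon-SVR.
# The paper derives LOO *bounds* to sidestep this retraining cost; here the LOO
# error is computed directly on a small synthetic problem.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)

def loo_mse(C, gamma, epsilon=0.1):
    """Mean squared leave-one-out error for one (C, gamma) setting."""
    errs = []
    for train, test in LeaveOneOut().split(X):
        model = SVR(C=C, gamma=gamma, epsilon=epsilon)
        model.fit(X[train], y[train])
        errs.append((model.predict(X[test])[0] - y[test][0]) ** 2)
    return float(np.mean(errs))

# Small grid search guided by the LOO estimate.
grid = [(C, g) for C in (0.1, 1.0, 10.0) for g in (0.1, 1.0)]
best = min(grid, key=lambda p: loo_mse(*p))
print("best (C, gamma):", best)
```

With n training points and a grid of size k, this requires k·n full SVR fits, which is what makes cheap LOO bounds attractive for model selection.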
Cites Work
- Formulations of Support Vector Machines: A Note from an Optimization Point of View
- Training ν-Support Vector Regression: Theory and Algorithms
- Optimization Problems with Perturbations: A Guided Tour
- Radius Margin Bounds for Support Vector Machines with the RBF Kernel
- A probabilistic framework for SVM regression and error bar estimation
- Choosing multiple parameters for support vector machines