Leave-One-Out Bounds for Support Vector Regression Model Selection
From MaRDI portal
Publication: 4678450
DOI: 10.1162/0899766053491869
zbMATH Open: 1096.68128
OpenAlex: W2106701503
Wikidata: Q60241071 (Scholia: Q60241071)
MaRDI QID: Q4678450
FDO: Q4678450
Authors: Ming-Wei Chang, Chih-Jen Lin
Publication date: 23 May 2005
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/0899766053491869
Recommendations
- Leave one support vector out cross validation for fast estimation of generalization errors
- Efficient leave-\(m\)-out cross-validation of support vector regression by generalizing decremental algorithm
- Efficient Computation and Model Selection for the Support Vector Regression
- SVM Model Selection with the VC Bound
- Fast exact leave-one-out cross-validation of sparse least-squares support vector machines
- A novel approach of model selection for support vector machines
- Advances in Neural Networks – ISNN 2005
- Selection methods for extended least squares support vector machines
Cites Work
- Formulations of support vector machines: A note from an optimization point of view
- Choosing multiple parameters for support vector machines
- A probabilistic framework for SVM regression and error bar estimation
- Radius Margin Bounds for Support Vector Machines with the RBF Kernel
- Training v-Support Vector Regression: Theory and Algorithms
- Optimization Problems with Perturbations: A Guided Tour
Cited In (11)
- Backward elimination model construction for regression and classification using leave-one-out criteria
- Leave-One-Out Bounds for Kernel Methods
- Efficient optimization of hyper-parameters for least squares support vector regression
- Radius Margin Bounds for Support Vector Machines with the RBF Kernel
- Efficient sparse least squares support vector machines for pattern classification
- On linear programs with linear complementarity constraints
- An instrumental least squares support vector machine for nonlinear system identification
- Efficient leave-\(m\)-out cross-validation of support vector regression by generalizing decremental algorithm
- Title not available
- Title not available
- Efficient Computation and Model Selection for the Support Vector Regression