Fast exact leave-one-out cross-validation of sparse least-squares support vector machines
From MaRDI portal
Publication: Q705597
DOI: 10.1016/J.NEUNET.2004.07.002
zbMATH Open: 1073.68072
DBLP: journals/nn/CawleyT04
OpenAlex: W2069659771
Wikidata: Q80997718 (Scholia: Q80997718)
MaRDI QID: Q705597 (FDO: Q705597)
Authors: Gavin C. Cawley, Nicola L. C. Talbot
Publication date: 31 January 2005
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2004.07.002
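The paper's topic can be illustrated with the classic closed-form leave-one-out identity for regularized least squares, which the authors extend to sparse LS-SVMs. The sketch below is illustrative only (it is not the paper's exact algorithm): for ridge regression with hat matrix \(H = X(X^\top X + \lambda I)^{-1}X^\top\), the exact LOO residual is \(e_i / (1 - h_{ii})\), verified here against brute-force refitting.

```python
# Illustrative sketch (not the paper's algorithm): exact leave-one-out
# residuals for ridge regression via the PRESS identity e_i / (1 - h_ii).
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 30, 4, 1e-2
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Hat matrix H = X (X^T X + lam I)^{-1} X^T
A = X.T @ X + lam * np.eye(d)
H = X @ np.linalg.solve(A, X.T)
resid = y - H @ y
loo_fast = resid / (1.0 - np.diag(H))  # one fit, all n LOO residuals

# Brute-force check: refit the model with each point held out in turn
loo_slow = np.empty(n)
for i in range(n):
    m = np.ones(n, dtype=bool)
    m[i] = False
    Ai = X[m].T @ X[m] + lam * np.eye(d)
    w = np.linalg.solve(Ai, X[m].T @ y[m])
    loo_slow[i] = y[i] - X[i] @ w

assert np.allclose(loo_fast, loo_slow)
```

The identity follows from the Sherman-Morrison rank-one update of \((X^\top X + \lambda I)^{-1}\) when one row is removed, which is also the mechanism behind several of the matrix-update references cited below (Sherman-Morrison, Bartlett, Woodbury).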
Recommendations
- Leave one support vector out cross validation for fast estimation of generalization errors
- Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression
- Efficient leave-\(m\)-out cross-validation of support vector regression by generalizing decremental algorithm
- Fast Generalized Cross-Validation Algorithm for Sparse Model Learning
- Efficient cross-validation for kernelized least-squares regression with sparse basis expansions
- Fast computation of cross-validated properties in full linear leave-many-out procedures
- A sparse least squares support vector machine
- A novel sparse least squares support vector machines
- Efficient approximate leave-one-out cross-validation for kernel logistic regression
- Efficient sparse least squares support vector machines for pattern classification
Cites Work
- The elements of statistical learning. Data mining, inference, and prediction
- Regularization algorithms for learning that are equivalent to multilayer networks
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- A Simplex Method for Function Minimization
- Hedonic housing prices and the demand for clean air
- Some results on Tchebycheffian spline functions and stochastic processes
- Adjustment of an Inverse Matrix Corresponding to a Change in One Element of a Given Matrix
- An Inverse Matrix Adjustment Arising in Discriminant Analysis
- Updating the Inverse of a Matrix
- The Relationship between Variable Selection and Data Agumentation and a Method for Prediction
- Interpolation of scattered data: distance matrices and conditionally positive definite functions
- Choosing multiple parameters for support vector machines
- Chaos control using least-squares support vector machines
- Weighted least squares support vector machines: robustness and sparse approximation
- Improved sparse least-squares support vector machines
- DOI: 10.1162/15324430260185637
Cited In (19)
- Nuclear discrepancy for single-shot batch active learning
- Fast Generalized Cross-Validation Algorithm for Sparse Model Learning
- Fast computation of cross-validated properties in full linear leave-many-out procedures
- A least-squares method for sparse low rank approximation of multivariate functions
- Efficient approximate leave-one-out cross-validation for kernel logistic regression
- Leave-One-Out Bounds for Support Vector Regression Model Selection
- Efficient cross-validation for kernelized least-squares regression with sparse basis expansions
- A multiscale method for semi-linear elliptic equations with localized uncertainties and non-linearities
- StreaMRAK a streaming multi-resolution adaptive kernel algorithm
- Leave one support vector out cross validation for fast estimation of generalization errors
- Efficient sparse least squares support vector machines for pattern classification
- Low rank updated LS-SVM classifiers for fast variable selection
- Model selection for the LS-SVM. Application to handwriting recognition
- Efficient approximate \(k\)-fold and leave-one-out cross-validation for ridge regression
- FEATURE SELECTION VIA LEAST SQUARES SUPPORT FEATURE MACHINE
- Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression
- Training sparse least squares support vector machines by the QR decomposition
- Efficient leave-\(m\)-out cross-validation of support vector regression by generalizing decremental algorithm
- Optimized fixed-size kernel models for large data sets