Efficient cross-validation for kernelized least-squares regression with sparse basis expansions
From MaRDI portal
Publication:439007
DOI: 10.1007/s10994-012-5287-6
zbMath: 1243.68246
OpenAlex: W2002643138
MaRDI QID: Q439007
Hanna Suominen, Tapio Pahikkala, Jorma Boberg
Publication date: 31 July 2012
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-012-5287-6
cross-validation; kernel methods; hold-out; least-squares support vector machine; regularized least-squares; sparse basis expansions
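The keywords point to the well-known fast leave-one-out shortcut for regularized least-squares: because kernel ridge regression is a linear smoother with hat matrix H = K(K + λI)⁻¹, each LOO residual equals (yᵢ − ŷᵢ)/(1 − Hᵢᵢ), so all n residuals come from one fit instead of n refits. The sketch below illustrates that classical identity only; it is not the paper's algorithm for sparse basis expansions, and all function and variable names are illustrative.

```python
import numpy as np

def loo_residuals_krr(K, y, lam):
    """Exact leave-one-out residuals for kernel ridge regression from a
    single fit, via e_i = (y_i - yhat_i) / (1 - H_ii), H = K (K + lam I)^-1."""
    n = K.shape[0]
    H = K @ np.linalg.inv(K + lam * np.eye(n))  # hat (smoother) matrix
    yhat = H @ y                                # in-sample predictions
    return (y - yhat) / (1.0 - np.diag(H))      # LOO residuals, O(n^3) total

# Toy usage with a Gaussian kernel on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X[:, 0] + 0.1 * rng.normal(size=50)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)
e = loo_residuals_krr(K, y, lam=1.0)
```

Refitting with one point held out and comparing the held-out residual confirms the identity numerically.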
Related Items (1)
Cites Work
- Fast exact leave-one-out cross-validation of sparse least-squares support vector machines
- Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression
- Matrix representations, linear transformations, and kernels for disambiguation in natural language
- Inference for the generalization error
- An efficient algorithm for learning to rank from preference graphs
- Optimized fixed-size kernel models for large data sets
- Additive regularization trade-off: fusion of training and validation levels in kernel methods
- Efficient leave-\(m\)-out cross-validation of support vector regression by generalizing decremental algorithm
- Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks
- Matrix Analysis
- A Statistical View of Some Chemometrics Regression Tools
- Ten More Years of Error Rate Research
- Kernel matching pursuit