Efficient Computation and Model Selection for the Support Vector Regression
DOI: 10.1162/neco.2007.19.6.1633
zbMath: 1119.68150
OpenAlex: W1979859909
Wikidata: Q51626930 (Scholia: Q51626930)
MaRDI QID: Q3593966
Publication date: 6 August 2007
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco.2007.19.6.1633
Mathematics Subject Classification:
- Learning and adaptive systems in artificial intelligence (68T05)
- Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.) (68T20)
Related Items
- The regularization paths for the ROC-optimizing support vector machines
- Incremental learning for \(\nu\)-support vector regression
- A regularization path algorithm for support vector ordinal regression
- Generalized Kalman smoothing: modeling and algorithms
- Multi-parametric solution-path algorithm for instance-weighted support vector machines
- Feasible generalized least squares using support vector regression
- On the ``degrees of freedom'' of the lasso
- Approximate penalization path for smoothly clipped absolute deviation
- An algebraic characterization of the optimum of regularized kernel methods
Uses Software
Cites Work
- Estimation of the mean of a multivariate normal distribution
- Multivariate adaptive regression splines
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- On the degrees of freedom in shape-restricted regression
- Some results on Tchebycheffian spline functions and stochastic processes
- Training \(\nu\)-Support Vector Regression: Theory and Algorithms
- How Biased is the Apparent Error Rate of a Prediction Rule?
- Reoptimization With the Primal-Dual Interior Point Method
- Leave-One-Out Bounds for Support Vector Regression Model Selection
- Accurate On-line Support Vector Regression
- The Estimation of Prediction Error