Training ν-Support Vector Regression: Theory and Algorithms
From MaRDI portal
Publication:3149528
DOI: 10.1162/089976602760128081
zbMath: 1010.68114
MaRDI QID: Q3149528
Chih-Chung Chang, Chih-Jen Lin
Publication date: 25 September 2002
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/089976602760128081
68T05: Learning and adaptive systems in artificial intelligence
Related Items
- Leave-One-Out Bounds for Support Vector Regression Model Selection
- Accurate On-line Support Vector Regression
- Adaptive Two-Pass Median Filter Based on Support Vector Machines for Image Restoration
- The maximum vector-angular margin classifier and its fast training on large datasets using a core vector machine
- Noise model based \(\nu\)-support vector regression with its application to short-term wind speed forecasting
- Distribution-free consistency of empirical risk minimization and support vector regression
- Incremental learning for \(\nu\)-support vector regression
- European exchange trading funds trading with locally weighted support vector regression
- Maximum likelihood optimal and robust support vector regression with \textit{lncosh} loss function
- Kernel ridge regression model based on beta-noise and its application in short-term wind speed forecasting
- Robust \(\nu\)-support vector machine based on worst-case conditional value-at-risk minimization
- Efficient Computation and Model Selection for the Support Vector Regression
Cites Work
- Training ν-Support Vector Classifiers: Theory and Algorithms
- Efficient SVM regression training with SMO
- A simple decomposition method for support vector machines
- Feasible direction decomposition algorithms for training support vector machines
- Convergence of a generalized SMO algorithm for SVM classifier design