Homotopy continuation approaches for robust SV classification and regression
From MaRDI portal
Mathematics Subject Classification:
- Nonparametric robustness (62G35)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
- Applications of mathematical programming (90C90)
- Nonconvex programming, global optimization (90C26)
Abstract: In support vector machine (SVM) applications with unreliable data containing a portion of outliers, the non-robustness of SVMs often causes considerable performance deterioration. Although many approaches for improving the robustness of SVMs have been studied, two major challenges remain in robust SVM learning. First, robust learning algorithms are essentially formulated as non-convex optimization problems. It is thus important to develop a non-convex optimization method for robust SVM that can find a good local optimal solution. The second practical issue is how to tune the hyperparameter that controls the balance between robustness and efficiency. Unfortunately, due to the non-convexity, robust SVM solutions with slightly different hyperparameter values can differ significantly, which makes model selection highly unstable. In this paper, we address these two issues simultaneously by introducing a novel homotopy approach to non-convex robust SVM learning. Our basic idea is to introduce parametrized formulations of robust SVM that bridge the standard SVM and the fully robust SVM via a parameter representing the influence of outliers. We characterize necessary and sufficient conditions for the local optimal solutions of robust SVM, and develop an algorithm that can trace a path of local optimal solutions as the influence of outliers is gradually decreased. An advantage of our homotopy approach is that it can be interpreted as simulated annealing, a common approach for finding a good local optimal solution in non-convex optimization problems. In addition, our homotopy method allows stable and efficient model selection based on the path of local optimal solutions. The empirical performance of the proposed approach is demonstrated through extensive numerical experiments on both robust classification and regression problems.
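The path-following idea described in the abstract can be illustrated with a minimal sketch (not the paper's actual algorithm): a linear SVM trained by subgradient descent on a truncated hinge loss, warm-started along a decreasing sequence of truncation levels `s`, which here plays the role of the homotopy parameter controlling the influence of outliers. All function names, the loss choice, and the parameter values below are illustrative assumptions.

```python
import numpy as np

def robust_svm_homotopy(X, y, C=1.0, s_path=(np.inf, 4.0, 2.0, 1.0),
                        lr=0.01, epochs=500):
    """Illustrative sketch: linear SVM with the truncated hinge loss
    min(max(0, 1 - y_i f(x_i)), s), trained by subgradient descent.
    The truncation level s acts as a homotopy parameter: s = inf gives
    the standard convex SVM, and as s shrinks, examples whose hinge loss
    exceeds s (suspected outliers) stop contributing a subgradient.
    Each stage is warm-started from the previous stage's solution."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for s in s_path:                          # anneal from convex to robust
        for _ in range(epochs):
            hinge = 1.0 - y * (X @ w + b)
            act = (hinge > 0) & (hinge < s)   # points with a nonzero subgradient
            gw = w - C * (X[act].T @ y[act])  # d/dw of 0.5||w||^2 + C * loss
            gb = -C * y[act].sum()
            w -= lr * gw / n
            b -= lr * gb / n
    return w, b
```

The warm start is the essential point: each non-convex stage begins at the local solution of an easier (less truncated) problem, which is what gives the schedule its simulated-annealing flavor.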
Recommendations
- Convergence rate of SVM for kernel-based robust regression
- Robust support vector regression in the primal
- Robust support vector machines for classification and computational issues
- Training robust support vector regression with smooth non-convex loss function
- Robust support vector machines for classification with nonconvex and smooth losses
Cites work
- scientific article; zbMATH DE number 5957245
- scientific article; zbMATH DE number 434714
- scientific article; zbMATH DE number 3850845
- scientific article; zbMATH DE number 1215253
- scientific article; zbMATH DE number 1332320
- scientific article; zbMATH DE number 2107836
- 10.1162/153244303321897690
- Algorithmics for hard problems.
- Approximating parameterized convex optimization problems
- Multicategory ψ-Learning
- Nearly unbiased variable selection under minimax concave penalty
- On ψ-Learning
- Robust Truncated Hinge Loss Support Vector Machines
- SparseNet: coordinate descent with nonconvex penalties
- Support-vector networks
- The Concave-Convex Procedure
Cited in (3)
This page was built for publication: Homotopy continuation approaches for robust SV classification and regression