A semismooth Newton method for support vector classification and regression
From MaRDI portal
Publication:2419554
Abstract: The support vector machine is an important and fundamental technique in machine learning. In this paper, we apply a semismooth Newton method to solve two typical SVM models: the L2-loss SVC model and the epsilon-L2-loss SVR model. The semismooth Newton method is widely used in the optimization community, and it is commonly believed to offer a fast convergence rate at the price of high computational complexity. Our contribution in this paper is that, by exploiting the sparse structure of the models, we significantly reduce the computational complexity while keeping the quadratic convergence rate. Extensive numerical experiments demonstrate the outstanding performance of the semismooth Newton method, especially on problems with very large sample sizes (for the news20.binary problem with 19996 samples and 1355191 features, it takes only three seconds). In particular, for the epsilon-L2-loss SVR model, the semismooth Newton method significantly outperforms the leading solvers, including DCD and TRON.
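The abstract's key idea can be sketched in a few lines. The following is a minimal, hedged illustration (not the authors' implementation) of a semismooth Newton iteration for the primal L2-loss SVC, min_w 0.5·||w||² + C·Σᵢ max(0, 1 − yᵢ·xᵢᵀw)²: the objective is once differentiable, a generalized Hessian involves only the "active" samples with positive margin violation, and that active-set structure is exactly what keeps each Newton system cheap. Function and variable names here are illustrative.

```python
import numpy as np

def l2_svc_semismooth_newton(X, y, C=1.0, tol=1e-8, max_iter=50):
    """Semismooth Newton sketch for the primal L2-loss SVC:
        min_w  0.5*||w||^2 + C * sum_i max(0, 1 - y_i * x_i^T w)^2
    The gradient is piecewise linear (hence semismooth); a generalized
    Hessian uses only the active samples, which is the sparse structure
    the paper exploits to keep the per-iteration cost low."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(max_iter):
        margin = 1.0 - y * (X @ w)
        active = margin > 0                       # margin-violating samples
        Xa, ma, ya = X[active], margin[active], y[active]
        grad = w - 2.0 * C * Xa.T @ (ya * ma)
        if np.linalg.norm(grad) < tol:
            break
        # Generalized Hessian: I + 2C * Xa^T Xa, built from active rows only
        H = np.eye(d) + 2.0 * C * Xa.T @ Xa
        w -= np.linalg.solve(H, grad)             # full Newton step
    return w
```

In practice (and in the paper's setting) the Newton system would be solved iteratively, e.g. by conjugate gradients on the active rows, rather than by forming the dense Hessian as done above for clarity.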
Recommendations
Cites work
- scientific article; zbMATH DE number 823069 (no title available)
- A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix
- A Stochastic Approximation Method
- A finite Newton method for classification
- A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems
- A modified finite Newton method for fast solution of large scale linear SVMs
- A nonsmooth version of Newton's method
- A semismooth Newton method for tensor eigenvalue complementarity problem
- A semismooth Newton method for the nearest Euclidean distance matrix problem
- A sequential majorization method for approximating weighted time series of finite rank
- A sequential semismooth Newton method for the nearest low-rank correlation matrix problem
- Coordinate descent method for large-scale L2-loss linear support vector machines
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- LIBLINEAR: a library for large linear classification
- Large-scale linear support vector regression
- Methods of conjugate gradients for solving linear systems
- On the global convergence of the inexact semi-smooth Newton method for absolute value equation
- On the sparseness of 1-norm support vector machines
- Optimization and nonsmooth analysis
- Optimization methods for large-scale machine learning
- SSVM: A smooth support vector machine for classification
- Semismooth and Semiconvex Functions in Constrained Optimization
- Solving the OSCAR and SLOPE models using a semismooth Newton-based augmented Lagrangian method
- Support-vector networks
- The elements of statistical learning. Data mining, inference, and prediction
- The semismooth Newton method for the solution of quasi-variational inequalities
- Two-Point Step Size Gradient Methods
- Variational Analysis
Cited in (9)
- Non-interior-point smoothing Newton method for CP revisited and its application to support vector machines
- Semismooth support vector machines.
- Recursive Finite Newton Algorithm for Support Vector Regression in the Primal
- A subspace elimination strategy for accelerating support matrix machine
- Analysis of loss functions in support vector machines
- A majorization penalty method for SVM with sparse constraint
- An efficient augmented Lagrangian method for support vector machine
- Multi‐innovation Newton recursive methods for solving the support vector machine regression problems
- Regularized nonsmooth Newton method for multi-class support vector machines