A semismooth Newton method for support vector classification and regression
DOI: 10.1007/S10589-019-00075-Z
zbMATH Open: 1423.90276
arXiv: 1903.00249
OpenAlex: W2911445670
Wikidata: Q128393146 (Scholia: Q128393146)
MaRDI QID: Q2419554
FDO: Q2419554
Authors: Juan Yin, Qing-Na Li
Publication date: 13 June 2019
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1903.00249
Keywords: generalized Jacobian; quadratic convergence; semismooth Newton method; support vector regression; support vector classification
Cites Work
- LIBLINEAR: a library for large linear classification
- The elements of statistical learning. Data mining, inference, and prediction
- SSVM: A smooth support vector machine for classification
- Support-vector networks
- Variational Analysis
- Title not available
- A Stochastic Approximation Method
- Two-Point Step Size Gradient Methods
- A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix
- Methods of conjugate gradients for solving linear systems
- A nonsmooth version of Newton's method
- Optimization and nonsmooth analysis
- Semismooth and Semiconvex Functions in Constrained Optimization
- A modified finite Newton method for fast solution of large scale linear SVMs
- Coordinate descent method for large-scale L2-loss linear support vector machines
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- A sequential semismooth Newton method for the nearest low-rank correlation matrix problem
- On the global convergence of the inexact semi-smooth Newton method for absolute value equation
- A semismooth Newton method for tensor eigenvalue complementarity problem
- A finite Newton method for classification
- The semismooth Newton method for the solution of quasi-variational inequalities
- On the sparseness of 1-norm support vector machines
- A semismooth Newton method for the nearest Euclidean distance matrix problem
- Large-scale linear support vector regression
- Optimization methods for large-scale machine learning
- A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems
- A sequential majorization method for approximating weighted time series of finite rank
- Solving the OSCAR and SLOPE models using a semismooth Newton-based augmented Lagrangian method
Cited In (8)
- A subspace elimination strategy for accelerating support matrix machine
- Regularized nonsmooth Newton method for multi-class support vector machines
- Analysis of loss functions in support vector machines
- Semismooth support vector machines
- Recursive Finite Newton Algorithm for Support Vector Regression in the Primal
- An efficient augmented Lagrangian method for support vector machine
- Non-interior-point smoothing Newton method for CP revisited and its application to support vector machines
- A majorization penalty method for SVM with sparse constraint