A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
From MaRDI portal
Publication: 711381
DOI: 10.1007/S10589-008-9215-4
zbMath: 1226.90062
OpenAlex: W2085650911
MaRDI QID: Q711381
Publication date: 26 October 2010
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-008-9215-4
Keywords: global convergence; error bound; linear constraints; quadratic program; support vector machine; linear convergence rate; coordinate gradient descent; conformal realization; continuous quadratic knapsack problem
Related Items (23)
- A method of bi-coordinate variations with tolerances and its convergence
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- Asymmetric \(\nu\)-tube support vector regression
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- A flexible coordinate descent method
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- Supervised classification and mathematical optimization
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- An almost cyclic 2-coordinate descent method for singly linearly constrained problems
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- The 2-coordinate descent method for solving double-sided simplex constrained minimization problems
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- Primal and dual predicted decrease approximation methods
- Robust multicategory support vector machines using difference convex algorithm
- Two smooth support vector machines for \(\varepsilon\)-insensitive regression
- Nonconvex proximal incremental aggregated gradient method with linear convergence
- Parallel decomposition methods for linearly constrained problems subject to simple bound with application to the SVMs training
- A Coordinate Wise Variational Method with Tolerance Functions
- A unified convergence framework for nonmonotone inexact decomposition methods
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Accelerated iterative hard thresholding algorithm for \(l_0\) regularized regression problem
Uses Software
Cites Work
- Semismooth support vector machines.
- An O(n) algorithm for quadratic knapsack problems
- A coordinate gradient descent method for nonsmooth separable minimization
- Decomposition algorithm model for singly linearly-constrained problems subject to lower and upper bounds
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Error bounds and convergence analysis of feasible descent methods: A general approach
- Polynomial-time decomposition algorithms for support vector machines
- On linear-time algorithms for the continuous quadratic Knapsack problem
- A convergent decomposition algorithm for support vector machines
- 10.1162/15324430260185619
- Error Bound and Convergence Analysis of Matrix Splitting Algorithms for the Affine Variational Inequality Problem
- Numerical Optimization
- SMO Algorithm for Least-Squares SVM Formulations
- Interior-Point Methods for Massive Support Vector Machines
- Downlink beamforming for DS-CDMA mobile radio with multimedia services
- Learning Theory
- Improvements to Platt's SMO Algorithm for SVM Classifier Design
- On the convergence of a modified version of the SVM\(^{light}\) algorithm
- Algorithmic Learning Theory
- Learning Theory
- Convergence of a generalized SMO algorithm for SVM classifier design
This page was built for publication: A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training