The Cyclic Block Conditional Gradient Method for Convex Optimization Problems
From MaRDI portal
Publication:3449572
DOI: 10.1137/15M1008397
zbMath: 1327.90193
arXiv: 1502.03716
OpenAlex: W2136106887
MaRDI QID: Q3449572
Edouard Pauwels, Amir Beck, Shoham Sabach
Publication date: 4 November 2015
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1502.03716
Keywords: support vector machine, iteration complexity, nonsmooth convex minimization, conditional gradient, cyclic block decomposition, linear oracle
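The keywords summarize the method: a conditional gradient (Frank-Wolfe) step applied cyclically to blocks of variables, where each block update calls only a linear oracle over that block's feasible set. A minimal sketch under assumed details (a least-squares objective over a product of unit simplices and the classic open-loop step size 2/(k+2); all names are illustrative, not the authors' implementation):

```python
import numpy as np

def simplex_linear_oracle(g):
    # Linear oracle over the unit simplex: <g, s> is minimized at the
    # standard basis vector indexed by the smallest gradient entry.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

def cyclic_block_cg(A, b, block_sizes, iters=200):
    # Minimize f(x) = 0.5 * ||Ax - b||^2 over a product of unit
    # simplices (one per block). Each outer iteration cycles through
    # the blocks, moving each block toward its linear-oracle vertex.
    x = np.concatenate([np.ones(m) / m for m in block_sizes])  # feasible start
    offsets = np.cumsum([0] + list(block_sizes))
    for k in range(iters):
        step = 2.0 / (k + 2)                  # open-loop step size rule
        grad = A.T @ (A @ x - b)              # gradient of f at current x
        for i in range(len(block_sizes)):
            lo, hi = offsets[i], offsets[i + 1]
            s = simplex_linear_oracle(grad[lo:hi])
            # Convex combination keeps the block inside its simplex.
            x[lo:hi] = (1 - step) * x[lo:hi] + step * s
            grad = A.T @ (A @ x - b)          # refresh after the block move
    return x
```

Each block update stays feasible by construction, since it is a convex combination of the current block and a simplex vertex; no projection is ever computed, which is the appeal of conditional gradient schemes.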
Related Items
- Block coordinate type methods for optimization and learning
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- An unexpected connection between Bayes \(A\)-optimal designs and the group Lasso
- Frank-Wolfe and friends: a journey into projection-free first-order optimization methods
- Primal and dual predicted decrease approximation methods
- Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
- Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Dual subgradient algorithms for large-scale nonsmooth learning problems
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- A generalized conditional gradient method and its connection to an iterative shrinkage method
- Conditional gradient algorithms with open loop step size rules
- On the convergence of the coordinate descent method for convex differentiable minimization
- A conditional gradient method with linear rate of convergence for solving convex linear systems
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Duality Between Subgradient and Conditional Gradient Methods
- On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes
- Iterated Hard Shrinkage for Minimization Problems with Sparsity Constraints
- Some comments on Wolfe's ‘away step’
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- On the Convergence of Block Coordinate Descent Type Methods
- A Tight Upper Bound on the Rate of Convergence of Frank-Wolfe Algorithm