A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
Publication:4646445
Abstract: This paper introduces a coordinate descent version of the Vũ-Condat algorithm. By coordinate descent, we mean that only a subset of the coordinates of the primal and dual iterates is updated at each iteration, the other coordinates being kept at their previous values. Our method makes it possible to solve optimization problems involving a combination of differentiable functions and constraints, as well as non-separable and non-differentiable regularizers. We show that the sequences generated by our algorithm converge to a saddle point of the problem at hand, for a wider range of parameter values than previous methods. In particular, the condition on the step sizes depends on the coordinate-wise Lipschitz constants of the differentiable function's gradient, a major feature that explains why classical coordinate descent performs so well when it is applicable. We then prove a sublinear rate of convergence in general, and a linear rate of convergence when the objective enjoys strong convexity properties. We illustrate the performance of the algorithm on a total-variation regularized least squares regression problem and on large-scale support vector machine problems.
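As a concrete illustration of the setting described in the abstract, the sketch below implements the underlying full-update Vũ-Condat iteration (not the paper's coordinate-descent variant) on the first experiment mentioned above: one-dimensional total-variation regularized least squares. The function name, step-size heuristic, and test data are illustrative assumptions rather than the authors' code; the step sizes are chosen to satisfy the standard condition 1/τ − σ‖D‖² ≥ L/2.

```python
# Minimal sketch (assumed setup, not the paper's coordinate-descent method)
# of the full-update Vu-Condat primal-dual iteration for
#     min_x  0.5*||x - b||^2 + lam*||D x||_1,
# where D is the 1-D forward-difference operator.
import numpy as np

def tv_least_squares_vu_condat(b, lam=1.0, n_iter=500):
    n = b.size
    # D: (n-1) x n forward-difference matrix, so the regularizer is lam*||Dx||_1.
    D = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    D[idx, idx] = -1.0
    D[idx, idx + 1] = 1.0

    L = 1.0                               # Lipschitz constant of grad f, f(x) = 0.5*||x - b||^2
    normA2 = np.linalg.norm(D, 2) ** 2    # spectral norm squared of D (<= 4 for differences)
    sigma = 1.0 / np.sqrt(normA2)         # heuristic dual step size (an assumption)
    tau = 0.9 / (L / 2 + sigma * normA2)  # ensures 1/tau - sigma*||D||^2 >= L/2

    x = np.zeros(n)
    y = np.zeros(n - 1)
    for _ in range(n_iter):
        # Primal step: g = 0 here, so prox_{tau g} is the identity.
        x_new = x - tau * ((x - b) + D.T @ y)
        # Dual step: prox of the conjugate of lam*||.||_1 is projection onto [-lam, lam].
        y = np.clip(y + sigma * D @ (2 * x_new - x), -lam, lam)
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    signal = np.repeat([0.0, 2.0, -1.0], 50)  # piecewise-constant ground truth
    b = signal + 0.3 * rng.standard_normal(signal.size)
    x_hat = tv_least_squares_vu_condat(b, lam=5.0, n_iter=2000)
    print("residual:", np.linalg.norm(x_hat - signal))
```

The paper's contribution replaces these full primal and dual updates with updates of randomly chosen coordinate blocks, which lets the primal step size scale with coordinate-wise rather than global Lipschitz constants.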
Recommendations
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Efficiency of coordinate descent methods on huge-scale optimization problems
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- A flexible coordinate descent method
- Coordinate descent with arbitrary sampling. I: Algorithms and complexity
Cites work
- scientific article; zbMATH DE number 3833218 (title unavailable)
- scientific article; zbMATH DE number 51132 (title unavailable)
- scientific article; zbMATH DE number 3574917 (title unavailable)
- scientific article; zbMATH DE number 3449561 (title unavailable)
- A Coordinate Descent Primal-Dual Algorithm and Application to Distributed Asynchronous Optimization
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A class of randomized primal-dual algorithms for distributed optimization
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- A coordinate gradient descent method for nonsmooth separable minimization
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- A smooth primal-dual optimization framework for nonsmooth composite convex minimization
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- A three-operator splitting scheme and its optimization applications
- Accelerated, parallel, and proximal coordinate descent
- Algorithm 778: L-BFGS-B
- An inexact proximal path-following algorithm for constrained convex minimization
- An introduction to total variation for image analysis
- Bregman Monotone Optimization Algorithms
- Convergence analysis of primal-dual algorithms for a saddle-point problem: from contraction perspective
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Efficient serial and parallel coordinate descent methods for huge-scale truss topology design
- Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions
- First-order algorithms for convex optimization with nonseparable objective and coupled constraints
- Incremental majorization-minimization optimization with application to large-scale machine learning
- Incremental proximal methods for large scale convex optimization
- Iteration complexity analysis of block coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Minimizing Certain Convex Functions
- On the convergence of block coordinate descent type methods
- On the convergence of the coordinate descent method for convex differentiable minimization
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- On the ergodic convergence rates of a first-order primal-dual algorithm
- Parallel coordinate descent methods for big data optimization
- Pathwise coordinate optimization
- Randomized primal-dual proximal block coordinate updates
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Stochastic dual coordinate ascent methods for regularized loss minimization
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping
- Subgradient methods for huge-scale optimization problems
Cited in (28)
- Coordinate Descent Face-Off: Primal or Dual?
- Convergence properties of a randomized primal-dual algorithm with applications to parallel MRI
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
- Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization
- Block-wise primal-dual algorithms for large-scale doubly penalized ANOVA modeling
- Primal-dual block-proximal splitting for a class of non-convex problems
- A generic coordinate descent solver for non-smooth convex optimisation
- Linear convergence of randomized feasible descent methods under the weak strong convexity assumption
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
- A flexible coordinate descent method
- Cyclic coordinate-update algorithms for fixed-point problems: analysis and applications
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- Cyclic Coordinate Dual Averaging with Extrapolation
- On the convergence of stochastic primal-dual hybrid gradient
- scientific article; zbMATH DE number 6962012 (title unavailable)
- On the complexity analysis of the primal solutions for the accelerated randomized dual coordinate ascent
- Non-ergodic convergence rate of an inertial accelerated primal-dual algorithm for saddle point problems
- A block successive upper-bound minimization method of multipliers for linearly constrained convex optimization
- Stochastic primal-dual coordinate method for regularized empirical risk minimization
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Dual randomized coordinate descent method for solving a class of nonconvex problems
- Practical acceleration of the Condat-Vũ algorithm
- Acceleration of the PDHGM on partially strongly convex functions
- An alternative extrapolation scheme of PDHGM for saddle point problem with nonlinear function
- A new large-scale learning algorithm for generalized additive models
- Distributed composite optimization for multi-agent systems with asynchrony