A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
DOI: 10.1137/18M1168480 · zbMATH Open: 1411.90265 · arXiv: 1508.04625 · Wikidata: Q128616281 · Scholia: Q128616281 · MaRDI QID: Q4646445 · FDO: Q4646445
Authors: Olivier Fercoq, Pascal Bianchi
Publication date: 14 January 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1508.04625
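The paper studies a primal-dual algorithm in which, at each iteration, only a randomly chosen coordinate block is updated. The following is a toy sketch in the spirit of Vũ-Condat-type primal-dual splitting with random coordinate updates, not the exact method of the paper: it solves min_x ½‖x − c‖² + λ‖x‖₁, where the linear operator is the identity so each dual coordinate couples with a single primal coordinate. All names and step-size choices here are illustrative assumptions.

```python
import random

def prox_dual_l1(y, lam):
    """Projection onto the l-infinity ball of radius lam, i.e. the prox
    of the convex conjugate of lam*||.||_1."""
    return max(-lam, min(lam, y))

def pd_coordinate_descent(c, lam, tau=0.5, sigma=0.5, iters=20000, seed=0):
    """Illustrative primal-dual coordinate iteration (a simplified sketch,
    not the algorithm of the paper) for
        min_x 0.5*||x - c||^2 + lam*||x||_1.
    Since the coupling operator is the identity, the primal coordinate i
    and the dual coordinate i can be updated together."""
    rng = random.Random(seed)
    n = len(c)
    x = [0.0] * n
    y = [0.0] * n
    for _ in range(iters):
        i = rng.randrange(n)          # pick one coordinate uniformly at random
        x_old = x[i]
        # primal step: gradient of 0.5*(x_i - c_i)^2 plus the dual term y_i
        x[i] -= tau * (x[i] - c[i] + y[i])
        # dual step with the usual extrapolation 2*x_new - x_old
        y[i] = prox_dual_l1(y[i] + sigma * (2 * x[i] - x_old), lam)
    return x

def soft_threshold(v, lam):
    """Closed-form minimizer of the toy problem, used as a reference."""
    return [max(abs(t) - lam, 0.0) * (1.0 if t > 0 else -1.0) for t in v]
```

With τ = σ = 0.5 the standard Vũ-Condat step-size condition 1/τ − σ‖A‖² ≥ L/2 holds (here ‖A‖ = L = 1), and the iterates approach the soft-thresholding solution.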
Recommendations
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Efficiency of coordinate descent methods on huge-scale optimization problems
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- A flexible coordinate descent method
- Coordinate descent with arbitrary sampling. I: Algorithms and complexity.
MSC Classification
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Discrete approximations in optimal control (49M25)
Cites Work
- Algorithm 778: L-BFGS-B
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Pathwise coordinate optimization
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Title not available
- A coordinate gradient descent method for nonsmooth separable minimization
- Incremental majorization-minimization optimization with application to large-scale machine learning
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Title not available
- An introduction to total variation for image analysis
- Title not available
- Bregman Monotone Optimization Algorithms
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Parallel coordinate descent methods for big data optimization
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Convergence analysis of primal-dual algorithms for a saddle-point problem: from contraction perspective
- Accelerated, parallel, and proximal coordinate descent
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Subgradient methods for huge-scale optimization problems
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Efficient serial and parallel coordinate descent methods for huge-scale truss topology design
- A class of randomized primal-dual algorithms for distributed optimization
- Title not available
- Stochastic quasi-Fejér block-coordinate fixed point iterations with random sweeping
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- Stochastic dual coordinate ascent methods for regularized loss minimization
- On the convergence of block coordinate descent type methods
- Incremental proximal methods for large scale convex optimization
- On the ergodic convergence rates of a first-order primal-dual algorithm
- A three-operator splitting scheme and its optimization applications
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- On the convergence of the coordinate descent method for convex differentiable minimization
- Iteration complexity analysis of block coordinate descent methods
- An inexact proximal path-following algorithm for constrained convex minimization
- Minimizing Certain Convex Functions
- Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Randomized primal-dual proximal block coordinate updates
- First-order algorithms for convex optimization with nonseparable objective and coupled constraints
- A Coordinate Descent Primal-Dual Algorithm and Application to Distributed Asynchronous Optimization
- A smooth primal-dual optimization framework for nonsmooth composite convex minimization
Cited In (28)
- Convergence properties of a randomized primal-dual algorithm with applications to parallel MRI
- Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Block-wise primal-dual algorithms for large-scale doubly penalized ANOVA modeling
- Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization
- Primal-dual block-proximal splitting for a class of non-convex problems
- A generic coordinate descent solver for non-smooth convex optimisation
- Linear convergence of randomized feasible descent methods under the weak strong convexity assumption
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
- Cyclic coordinate-update algorithms for fixed-point problems: analysis and applications
- A flexible coordinate descent method
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Cyclic Coordinate Dual Averaging with Extrapolation
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- On the convergence of stochastic primal-dual hybrid gradient
- Title not available
- On the complexity analysis of the primal solutions for the accelerated randomized dual coordinate ascent
- Non-ergodic convergence rate of an inertial accelerated primal-dual algorithm for saddle point problems
- A block successive upper-bound minimization method of multipliers for linearly constrained convex optimization
- Stochastic primal-dual coordinate method for regularized empirical risk minimization
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Dual randomized coordinate descent method for solving a class of nonconvex problems
- Practical acceleration of the Condat-Vũ algorithm
- Acceleration of the PDHGM on partially strongly convex functions
- A new large-scale learning algorithm for generalized additive models
- An alternative extrapolation scheme of PDHGM for saddle point problem with nonlinear function
- Distributed composite optimization for multi-agent systems with asynchrony
- Coordinate Descent Face-Off: Primal or Dual?