An accelerated coordinate gradient descent algorithm for non-separable composite optimization
From MaRDI portal
Publication: Q2139254
DOI: 10.1007/s10957-021-01957-1
zbMATH Open: 1492.90119
OpenAlex: W3217620259
MaRDI QID: Q2139254
FDO: Q2139254
Publication date: 17 May 2022
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-021-01957-1
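For context, the publication concerns composite optimization, i.e. minimizing F(x) = f(x) + g(x) with f smooth and g possibly nonsmooth and non-separable. The following is a minimal, generic randomized proximal coordinate gradient sketch of this problem class; it is not the accelerated algorithm of the paper, and all names, step sizes, and the example data are illustrative assumptions.

```python
import numpy as np

def prox_coordinate_gradient(grad_f, prox_g, x0, L, n_iters=100, rng=None):
    """Generic sketch: minimize f(x) + g(x) by updating one coordinate of the
    smooth part per iteration, then applying the prox of g to the full vector
    (the full prox is what a non-separable g forces). Illustrative only."""
    rng = np.random.default_rng() if rng is None else rng
    x = x0.copy()
    n = x.size
    for _ in range(n_iters):
        i = rng.integers(n)            # pick a coordinate uniformly at random
        g_i = grad_f(x)[i]             # i-th partial derivative of the smooth part
        x[i] -= g_i / L[i]             # coordinate gradient step with step 1/L_i
        x = prox_g(x, 1.0 / L[i])      # proximal step for the (possibly non-separable) g
    return x

# Hypothetical usage: f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1 (soft-thresholding prox).
A = np.random.randn(20, 5); b = np.random.randn(20); lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda x, t: np.sign(x) * np.maximum(np.abs(x) - lam * t, 0.0)
L = np.array([np.linalg.norm(A[:, j]) ** 2 for j in range(A.shape[1])])
x_hat = prox_coordinate_gradient(grad_f, prox_g, np.zeros(5), L, n_iters=500)
```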
Recommendations
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- A coordinate gradient descent method for nonsmooth nonseparable minimization
- A coordinate gradient descent method for nonsmooth separable minimization
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- Gradient methods for minimizing composite functions
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Pathwise coordinate optimization
- Gradient methods for minimizing composite functions
- First-Order Methods in Optimization
- Title not available
- Title not available
- Title not available
- Title not available
- Title not available
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Proximité et dualité dans un espace hilbertien
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Accelerated, Parallel, and Proximal Coordinate Descent
- Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
- Coordinate descent algorithms
- On the Convergence of Block Coordinate Descent Type Methods
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- An \(O(n)\) algorithm for projecting a vector on the intersection of a hyperplane and a box in \(\mathbb R^n\)
- Smoothing and first order methods: a unified framework
- Total Variation on a Tree
- Iteration complexity analysis of block coordinate descent methods
- The Cyclic Block Conditional Gradient Method for Convex Optimization Problems
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Envelope functions: unifications and further properties
- Modular proximal optimization for multidimensional total-variation regularization
Cited In (6)
- A generic coordinate descent solver for non-smooth convex optimisation
- Interpolation conditions for linear operators and applications to performance estimation problems
- Accelerated inexact composite gradient methods for nonconvex spectral optimization problems
- Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization
- Coordinate descent methods beyond smoothness and separability
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
This page was built for publication: An accelerated coordinate gradient descent algorithm for non-separable composite optimization