Random Coordinate Descent Methods for Nonseparable Composite Optimization
From MaRDI portal
Publication:6176428
DOI: 10.1137/22M148700X
zbMath: 1522.90088
arXiv: 2203.14368
MaRDI QID: Q6176428
Publication date: 23 August 2023
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2203.14368
Keywords: convergence rates; composite minimization; adaptive stepsize; random coordinate descent; nonseparable objective function
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Stochastic programming (90C15)
Related Items (3)
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Distributed Proximal Gradient Algorithm for Partially Asynchronous Computer Clusters
- Coordinate descent methods beyond smoothness and separability
Cites Work
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- On the complexity analysis of randomized block-coordinate descent methods
- Accelerating the cubic regularization of Newton's method on convex problems
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Coordinate descent algorithms
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Pathwise coordinate optimization
- Cubic regularization of Newton method and its global performance
- Minimization of functions having Lipschitz continuous first partial derivatives
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- The University of Florida sparse matrix collection
- Inexact block coordinate descent methods with application to non-negative matrix factorization
- Accelerated, Parallel, and Proximal Coordinate Descent
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Proximal Gradient Methods with Adaptive Subspace Sampling
- Block Bregman Majorization Minimization with Extrapolation
- Inexact basic tensor methods for some classes of convex optimization problems
- Randomized sketch descent methods for non-separable linearly constrained optimization
- Gradient Descent Finds the Cubic-Regularized Nonconvex Newton Step
- Random Coordinate Descent Algorithms for Multi-Agent Convex Optimization Over Networks
- On the Convergence of Block Coordinate Descent Type Methods
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- On search directions for minimization algorithms
- Random Coordinate Descent Methods for Nonseparable Composite Optimization