On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
DOI: 10.1137/21M1460375
zbMATH: 1519.90184
arXiv: 2101.01323
MaRDI QID: Q6158001
Jian-feng Lu, Yingzhou Li, Ziang Chen
Publication date: 22 June 2023
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2101.01323
MSC classification: Nonconvex programming, global optimization (90C26); Dynamical systems in optimization and economics (37N40)
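The paper's subject, randomized coordinate gradient descent, updates a single randomly chosen coordinate per iteration with a partial gradient step. Below is a minimal illustrative sketch, not the paper's exact scheme or step-size rule; the objective, function names, and parameters are assumptions chosen only for the example.

```python
import numpy as np

def randomized_coordinate_gradient_descent(grad_f, x0, step_size=0.05, num_iters=1000, rng=None):
    """Illustrative randomized coordinate gradient descent.

    At each iteration one coordinate is chosen uniformly at random and
    updated by a gradient step along that coordinate only.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(num_iters):
        i = rng.integers(n)               # pick a coordinate uniformly at random
        x[i] -= step_size * grad_f(x)[i]  # partial gradient step on coordinate i
        # (the sketch evaluates the full gradient and uses one entry for
        #  simplicity; an efficient implementation would compute only that
        #  partial derivative)
    return x

# Example (assumed objective): f(x) = x1^2 - x2^2 + x2^4 / 4 has a strict
# saddle at the origin and local minimizers at (0, ±sqrt(2)); from a generic
# initialization the iterates typically approach a local minimizer.
grad = lambda x: np.array([2.0 * x[0], -2.0 * x[1] + x[1] ** 3])
x_star = randomized_coordinate_gradient_descent(grad, x0=np.array([0.5, 0.3]))
print(x_star)
```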
Cites Work
- Parallel coordinate descent methods for big data optimization
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Smooth center manifolds for random dynamical systems
- Random reordering in SOR-type methods
- A proof of Oseledec's multiplicative ergodic theorem
- Takens theorem for random dynamical systems
- A Siegel theorem for dynamical systems under random perturbations
- A coordinate gradient descent method for nonsmooth separable minimization
- A stochastic version of center manifold theory
- Ergodic theory of differentiable dynamical systems
- Characteristic exponents and invariant manifolds in Hilbert space
- A geometric analysis of phase retrieval
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Randomness and permutations in coordinate descent methods
- Coordinate descent algorithms
- First-order methods almost always avoid strict saddle points
- Behavior of accelerated gradient methods near critical points of nonconvex functions
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Normally hyperbolic invariant manifolds for random dynamical systems: Part I - persistence
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- Sternberg theorems for random dynamical systems
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Stochastic Stability of Lyapunov Exponents and Oseledets Splittings for Semi‐invertible Matrix Cocycles
- Lyapunov exponents and invariant manifolds for random dynamical systems in a Banach space
- Stability of Lyapunov exponents
- A Dynamical Proof of the Multiplicative Ergodic Theorem
- On Nonconvex Optimization for Machine Learning
- Analyzing random permutations for cyclic coordinate descent
- Coordinatewise Descent Methods for Leading Eigenvalue Problem
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- On the Convergence of Block Coordinate Descent Type Methods
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Poincaré theorems for random dynamical systems
- Random permutations fix a worst case for cyclic coordinate descent
- Inertial Proximal Block Coordinate Method for a Class of Nonsmooth Sum-of-Ratios Optimization Problems