Local linear convergence of proximal coordinate descent algorithm
DOI: 10.1007/s11590-023-01976-z
OpenAlex: W4353084616
MaRDI QID: Q6181368
Authors: Quentin Klopfenstein, Quentin Bertrand, Alexandre Gramfort, Joseph Salmon, Samuel Vaiter
Publication date: 22 January 2024
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-023-01976-z
Cites Work
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Iteration complexity analysis of block coordinate descent methods
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Relating \(\ell_p\) regularization and reweighted \(\ell_1\) regularization
- "Active-set complexity" of proximal gradient: how long does it take to find the sparsity pattern?
- Linear convergence of first order methods for non-strongly convex optimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Pathwise coordinate optimization
- Convergence rate analysis of proximal iteratively reweighted \(\ell_1\) methods for \(\ell_p\) regularization problems
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- Identifying Active Manifolds in Regularization Problems
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Local Linear Convergence of ISTA and FISTA on the LASSO Problem
- Activity Identification and Local Linear Convergence of Forward--Backward-type Methods
- Identifiable Surfaces in Constrained Optimization
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Accelerated, Parallel, and Proximal Coordinate Descent
- On the Identification of Active Constraints
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- On the Goldstein-Levitin-Polyak gradient projection method
- Atomic Decomposition by Basis Pursuit
- Sensitivity Analysis for Mirror-Stratifiable Convex Functions
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
- Model Consistency of Partly Smooth Regularizers
- Model selection with low complexity priors
- Active Sets, Nonsmoothness, and Sensitivity
- Prox-regular functions in variational analysis
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Regularization and Variable Selection Via the Elastic Net
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Signal Recovery by Proximal Forward-Backward Splitting
- Convergence of a block coordinate descent method for nondifferentiable minimization