On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
Publication: 4558510
zbMath 1404.68112 · arXiv 1607.02793 · MaRDI QID: Q4558510
Xing Guo Li, Han Liu, Tuo Zhao, Raman Arora, Mingyi Hong
Publication date: 22 November 2018
Full work available at URL: https://arxiv.org/abs/1607.02793
Keywords: gradient descent · quadratic minimization · cyclic block coordinate descent · improved iteration complexity · strongly convex minimization
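As a quick illustration of the method this record concerns, below is a minimal sketch of cyclic block coordinate gradient descent on a strongly convex quadratic. The block size, the step-size rule (inverse of the block Lipschitz constant), and the epoch count are illustrative assumptions, not the paper's exact setup or rates.

```python
# Minimal sketch: cyclic block coordinate gradient descent on
# f(x) = 0.5 * x^T A x - b^T x with A symmetric positive definite
# (hence f is strongly convex). Block size and step sizes are
# illustrative choices, not taken from the paper.
import numpy as np

def cyclic_bcd_quadratic(A, b, block_size=2, n_epochs=200):
    n = len(b)
    x = np.zeros(n)
    # Partition coordinates into contiguous blocks, swept in a fixed cyclic order.
    blocks = [np.arange(s, min(s + block_size, n)) for s in range(0, n, block_size)]
    for _ in range(n_epochs):
        for idx in blocks:
            # Gradient of f restricted to this block: A[idx, :] @ x - b[idx].
            g = A[idx, :] @ x - b[idx]
            # Blockwise step size 1 / L_i, with L_i the largest eigenvalue of the
            # diagonal sub-block A[idx, idx] (the block Lipschitz constant).
            L = np.linalg.eigvalsh(A[np.ix_(idx, idx)]).max()
            x[idx] -= g / L
    return x

# Usage: a random SPD system; the minimizer satisfies A x* = b.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)   # diagonal shift keeps A positive definite
b = rng.standard_normal(6)
x = cyclic_bcd_quadratic(A, b)
print(np.linalg.norm(A @ x - b))  # residual shrinks toward 0 with more epochs
```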
Related Items (9)
- Cyclic Coordinate Dual Averaging with Extrapolation
- Block Policy Mirror Descent
- Robust supervised learning with coordinate gradient descent
- Local linear convergence of proximal coordinate descent algorithm
- Analyzing random permutations for cyclic coordinate descent
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Two Symmetrized Coordinate Descent Methods Can Be $O(n^2)$ Times Slower Than the Randomized Version
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- On the complexity analysis of randomized block-coordinate descent methods
- On the linear convergence of the alternating direction method of multipliers
- Iteration complexity analysis of block coordinate descent methods
- A coordinate gradient descent method for nonsmooth separable minimization
- Triangular truncation and finding the norm of a Hadamard multiplier
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Introductory lectures on convex optimization. A basic course.
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- On non-ergodic convergence rate of Douglas-Rachford alternating direction method of multipliers
- Dual coordinate ascent methods for non-strictly convex minimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Pathwise coordinate optimization
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- On the Iteration Complexity of Cyclic Coordinate Gradient Descent Methods
- NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Matrix Analysis
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Regularization and Variable Selection Via the Elastic Net
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- The huge Package for High-dimensional Undirected Graph Estimation in R
- On the Convergence of Block Coordinate Descent Type Methods
- Strong Rules for Discarding Predictors in Lasso-Type Problems
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization