An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Publication: 3451763
DOI: 10.1137/141000270
zbMATH: 1329.65127
arXiv: 1407.1296
OpenAlex: W2163786124
MaRDI QID: Q3451763
Qihang Lin, Zhaosong Lu, Lin Xiao
Publication date: 18 November 2015
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1407.1296
Keywords: convex optimization; randomized algorithm; empirical risk minimization; accelerated proximal gradient method; coordinate descent method
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Complexity and performance of numerical algorithms (65Y20)
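For orientation, below is a minimal sketch of an accelerated randomized proximal coordinate gradient iteration in the spirit of the paper's APCG method, specialized to the strongly convex case with a constant step parameter and applied to an illustrative elastic-net-penalized least-squares problem. The test problem, function names, and the conservative strong-convexity estimate are assumptions made for this sketch, not code from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * |.| (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def apcg_sketch(A, b, lam, mu, n_iters=10000, seed=0):
    """Sketch of an APCG-style iteration for
        min_x 0.5*||A x - b||^2 + 0.5*mu*||x||^2 + lam*||x||_1,
    with mu > 0. The smooth part f(x) = 0.5*||Ax - b||^2 + 0.5*mu*||x||^2
    has coordinate Lipschitz constants L_i = ||A[:, i]||^2 + mu.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    L = np.sum(A * A, axis=0) + mu      # coordinate Lipschitz constants
    mu_L = mu / L.max()                 # conservative strong convexity w.r.t. the L-weighted norm
    alpha = np.sqrt(mu_L) / n           # constant step parameter (strongly convex case)

    x = np.zeros(n)
    z = x.copy()
    for _ in range(n_iters):
        y = (x + alpha * z) / (1.0 + alpha)
        i = rng.integers(n)                          # sample one coordinate uniformly
        g_i = A[:, i] @ (A @ y - b) + mu * y[i]      # i-th partial gradient of f at y
        zt = (1.0 - alpha) * z + alpha * y           # proximal center
        z = zt.copy()
        step = 1.0 / (n * alpha * L[i])
        z[i] = soft_threshold(zt[i] - step * g_i, lam * step)
        # x differs from y only in coordinate i, since z and zt agree elsewhere
        x = y + n * alpha * (z - zt)
    return x
```

Each iteration updates a single randomly chosen coordinate of the proximal point. The full-dimensional operations above (notably `A @ y`) cost O(mn) per iteration and are kept only for clarity; for regularized empirical risk minimization, the paper shows how to implement the accelerated iteration without full-dimensional vector operations.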
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Parallel coordinate descent methods for big data optimization
- Gradient methods for minimizing composite functions
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- On the complexity analysis of randomized block-coordinate descent methods
- Minimizing finite sums with the stochastic average gradient
- Iteration complexity analysis of block coordinate descent methods
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- Introductory lectures on convex optimization. A basic course.
- Coordinate descent optimization for \(l^{1}\) minimization with application to compressed sensing; a greedy algorithm
- Efficient block-coordinate descent algorithms for the group Lasso
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Coordinate descent algorithms for lasso penalized regression
- Block Coordinate Descent Methods for Semidefinite Programming
- Distributed Coordinate Descent Method for Learning with Big Data
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Accelerated, Parallel, and Proximal Coordinate Descent
- An accelerated randomized Kaczmarz algorithm
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- On the Convergence of Block Coordinate Descent Type Methods
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Convex Analysis
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization