An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
Publication: 3451763
DOI: 10.1137/141000270
zbMath: 1329.65127
arXiv: 1407.1296
OpenAlex: W2163786124
MaRDI QID: Q3451763
Qihang Lin, Zhaosong Lu, Lin Xiao
Publication date: 18 November 2015
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1407.1296
Keywords: convex optimization, randomized algorithm, empirical risk minimization, accelerated proximal gradient method, coordinate descent method
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Complexity and performance of numerical algorithms (65Y20)
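The record carries no abstract, so as a rough illustration of the method class named in the title and keywords, the sketch below implements a plain (non-accelerated) randomized proximal coordinate gradient step for a lasso-type problem. It is not the authors' APCG algorithm: the function names, the least-squares loss, the l1 regularizer, and all parameters are illustrative assumptions; the paper's method additionally uses Nesterov-type acceleration and covers general regularized empirical risk minimization.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*|.| (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def randomized_prox_cd(A, b, lam, n_iters=5000, seed=0):
    # Illustrative (non-accelerated) randomized proximal coordinate gradient
    # method for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    residual = A @ x - b                    # maintain A x - b incrementally
    L = (A ** 2).sum(axis=0)                # per-coordinate Lipschitz constants
    L = np.where(L > 0, L, 1.0)
    for _ in range(n_iters):
        i = rng.integers(n)                 # sample one coordinate uniformly
        g_i = A[:, i] @ residual            # partial gradient of the smooth part
        x_i_new = soft_threshold(x[i] - g_i / L[i], lam / L[i])  # prox step on coordinate i
        residual += A[:, i] * (x_i_new - x[i])
        x[i] = x_i_new
    return x

For instance, given a data matrix A of shape (m, n) and a response vector b, calling randomized_prox_cd(A, b, lam=0.1) returns an approximate lasso solution; each iteration touches only one column of A, which is what makes coordinate methods attractive for the large-scale problems classified above under 90C06.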
Related Items
- A penalty method for rank minimization problems in symmetric matrices
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Accelerated, Parallel, and Proximal Coordinate Descent
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
- An aggressive reduction on the complexity of optimization for non-strongly convex objectives
- Cyclic Coordinate Dual Averaging with Extrapolation
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
- Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
- A Randomized Coordinate Descent Method with Volume Sampling
- Two Symmetrized Coordinate Descent Methods Can Be \(O(n^2)\) Times Slower Than the Randomized Version
- An Optimal Algorithm for Decentralized Finite-Sum Optimization
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
- An introduction to continuous optimization for imaging
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization
- Acceleration of primal-dual methods by preconditioning and simple subproblem procedures
- An optimal randomized incremental gradient method
- A parallel line search subspace correction method for composite convex optimization
- On maximum residual block and two-step Gauss-Seidel algorithms for linear least-squares problems
- An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming
- A remark on accelerated block coordinate descent for computing the proximity operators of a sum of convex functions
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- CoordinateWise Descent Methods for Leading Eigenvalue Problem
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- An adaptive Polyak heavy-ball method
- Coordinate descent algorithms
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Parallel coordinate descent methods for big data optimization
- Gradient methods for minimizing composite functions
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- On the complexity analysis of randomized block-coordinate descent methods
- Minimizing finite sums with the stochastic average gradient
- Iteration complexity analysis of block coordinate descent methods
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- Introductory lectures on convex optimization. A basic course.
- Coordinate descent optimization for \(l^{1}\) minimization with application to compressed sensing; a greedy algorithm
- Efficient block-coordinate descent algorithms for the group Lasso
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Coordinate descent algorithms for lasso penalized regression
- Block Coordinate Descent Methods for Semidefinite Programming
- Distributed Coordinate Descent Method for Learning with Big Data
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Accelerated, Parallel, and Proximal Coordinate Descent
- An accelerated randomized Kaczmarz algorithm
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- On the Convergence of Block Coordinate Descent Type Methods
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Convex Analysis
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization