An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization

Publication: 3451763

DOI: 10.1137/141000270
zbMath: 1329.65127
arXiv: 1407.1296
OpenAlex: W2163786124
MaRDI QID: Q3451763

Qihang Lin, Zhaosong Lu, Lin Xiao

Publication date: 18 November 2015

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1407.1296
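
The title names an accelerated randomized proximal coordinate gradient (APCG) method for regularized empirical risk minimization. As a purely illustrative aid, the sketch below implements the basic, non-accelerated randomized proximal coordinate gradient step on a lasso-type ERM objective, i.e. the building block that methods of this kind accelerate. It is not the authors' APCG algorithm, and all names (A, b, lam, n_iters) are hypothetical.

```python
# Illustrative sketch (not the paper's APCG algorithm): randomized proximal
# coordinate gradient descent on the lasso-type ERM objective
#   min_x (1/2n) ||A x - b||^2 + lam * ||x||_1.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * |.| (coordinate-wise prox of the l1 term).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def randomized_prox_cd(A, b, lam, n_iters=10000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    residual = A @ x - b                      # A x - b, maintained incrementally
    L = (A ** 2).sum(axis=0) / n              # coordinate-wise Lipschitz constants
    L = np.maximum(L, 1e-12)                  # guard against all-zero columns
    for _ in range(n_iters):
        i = rng.integers(d)                   # pick a coordinate uniformly at random
        g_i = A[:, i] @ residual / n          # partial gradient of the smooth part
        x_new = soft_threshold(x[i] - g_i / L[i], lam / L[i])
        residual += A[:, i] * (x_new - x[i])  # O(n) residual update
        x[i] = x_new
    return x
```

Acceleration, in the spirit of the paper, would add an extrapolation sequence on top of this per-coordinate step while keeping each iteration's cost proportional to a single column of A.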



Related Items

A penalty method for rank minimization problems in symmetric matrices
Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
Accelerated, Parallel, and Proximal Coordinate Descent
On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization
A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
An aggressive reduction on the complexity of optimization for non-strongly convex objectives
Cyclic Coordinate Dual Averaging with Extrapolation
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming
Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
A Randomized Coordinate Descent Method with Volume Sampling
Two Symmetrized Coordinate Descent Methods Can Be \(O(n^2)\) Times Slower Than the Randomized Version
An Optimal Algorithm for Decentralized Finite-Sum Optimization
Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
An introduction to continuous optimization for imaging
Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization
Acceleration of primal-dual methods by preconditioning and simple subproblem procedures
An optimal randomized incremental gradient method
A parallel line search subspace correction method for composite convex optimization
On maximum residual block and two-step Gauss-Seidel algorithms for linear least-squares problems
An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming
A remark on accelerated block coordinate descent for computing the proximity operators of a sum of convex functions
Restarting the accelerated coordinate descent method with a rough strong convexity estimate
Coordinate descent with arbitrary sampling I: algorithms and complexity
Coordinate descent with arbitrary sampling II: expected separable overapproximation
CoordinateWise Descent Methods for Leading Eigenvalue Problem
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
An adaptive Polyak heavy-ball method
Coordinate descent algorithms


Uses Software


Cites Work