An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization

From MaRDI portal

Publication: 3451763

DOI: 10.1137/141000270
zbMath: 1329.65127
arXiv: 1407.1296
OpenAlex: W2163786124
MaRDI QID: Q3451763

Qihang Lin, Zhaosong Lu, Lin Xiao

Publication date: 18 November 2015

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1407.1296

Related Items (37)

A penalty method for rank minimization problems in symmetric matrices
Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
Accelerated, Parallel, and Proximal Coordinate Descent
On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization
Unnamed Item
A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
An aggressive reduction on the complexity of optimization for non-strongly convex objectives
Cyclic Coordinate Dual Averaging with Extrapolation
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming
Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
A Randomized Coordinate Descent Method with Volume Sampling
Two Symmetrized Coordinate Descent Methods Can Be \(O(n^2)\) Times Slower Than the Randomized Version
An Optimal Algorithm for Decentralized Finite-Sum Optimization
Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
An introduction to continuous optimization for imaging
Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization
Unnamed Item
Unnamed Item
Acceleration of primal-dual methods by preconditioning and simple subproblem procedures
An optimal randomized incremental gradient method
A parallel line search subspace correction method for composite convex optimization
On maximum residual block and two-step Gauss-Seidel algorithms for linear least-squares problems
An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming
A remark on accelerated block coordinate descent for computing the proximity operators of a sum of convex functions
Restarting the accelerated coordinate descent method with a rough strong convexity estimate
Coordinate descent with arbitrary sampling I: algorithms and complexity
Coordinate descent with arbitrary sampling II: expected separable overapproximation
CoordinateWise Descent Methods for Leading Eigenvalue Problem
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
Unnamed Item
Unnamed Item
An adaptive Polyak heavy-ball method
Coordinate descent algorithms

