Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization



DOI: 10.1007/s10107-014-0839-0
zbMath: 1342.90103
arXiv: 1309.2375
MaRDI QID: Q5962715

Shai Shalev-Shwartz, Tong Zhang

Publication date: 23 February 2016

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://arxiv.org/abs/1309.2375


MSC Classification

90C25: Convex programming

90C06: Large-scale problems in mathematical programming

90C15: Stochastic programming


Related Items

Accelerated Methods for Nonconvex Optimization
A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
Random Gradient Extrapolation for Distributed and Stochastic Optimization
Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
A General Distributed Dual Coordinate Optimization Framework for Regularized Loss Minimization
Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
An Optimal High-Order Tensor Method for Convex Optimization
Active Subspace of Neural Networks: Structural Analysis and Universal Attacks
On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
Spectrum Approximation Beyond Fast Matrix Multiplication: Algorithms and Hardness
On the Convergence of Stochastic Primal-Dual Hybrid Gradient
On the Adaptivity of Stochastic Gradient-Based Optimization
A Smooth Inexact Penalty Reformulation of Convex Problems with Linear Constraints
An Optimal Algorithm for Decentralized Finite-Sum Optimization
An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
Importance sampling in signal processing applications
Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
Kalman-Based Stochastic Gradient Method with Stop Condition and Insensitivity to Conditioning
Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
A generic coordinate descent solver for non-smooth convex optimisation
A new randomized primal-dual algorithm for convex optimization with fast last iterate convergence rates
A dual-based stochastic inexact algorithm for a class of stochastic nonsmooth convex composite problems
An aggressive reduction on the complexity of optimization for non-strongly convex objectives
Block mirror stochastic gradient method for stochastic optimization
A safe double screening strategy for elastic net support vector machine
An inexact primal-dual smoothing framework for large-scale non-bilinear saddle point problems
Unifying framework for accelerated randomized methods in convex optimization
On optimal probabilities in stochastic coordinate descent methods
On the complexity analysis of randomized block-coordinate descent methods
An extragradient-based alternating direction method for convex minimization
A new accelerated algorithm for ill-conditioned ridge regression problems
Stochastic primal dual fixed point method for composite optimization
Top-k multi-class SVM using multiple features
Distributed block-diagonal approximation methods for regularized empirical risk minimization
High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
The complexity of primal-dual fixed point methods for ridge regression
An optimal randomized incremental gradient method
Parallel decomposition methods for linearly constrained problems subject to simple bound with application to the SVMs training
Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization
An accelerated directional derivative method for smooth stochastic convex optimization
A unified convergence analysis of stochastic Bregman proximal gradient and extragradient methods
Randomized smoothing variance reduction method for large-scale non-smooth convex optimization
A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
Linear convergence of cyclic SAGA
Primal-dual block-proximal splitting for a class of non-convex problems
Provable accelerated gradient method for nonconvex low rank optimization
Block-proximal methods with spatially adapted acceleration
Convergence rates of accelerated proximal gradient algorithms under independent noise
An accelerated variance reducing stochastic method with Douglas-Rachford splitting
Variance reduction for root-finding problems
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
Accelerated, Parallel, and Proximal Coordinate Descent
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization

