Coordinate descent with arbitrary sampling. I: Algorithms and complexity.
From MaRDI portal
Publication:2829565
Abstract: We study the problem of minimizing the sum of a smooth convex function and a convex block-separable regularizer, and propose a new randomized coordinate descent method, which we call ALPHA. At every iteration, our method updates a random subset of coordinates, following an arbitrary distribution. No coordinate descent method capable of handling an arbitrary sampling had previously been studied in the literature for this problem. ALPHA is a remarkably flexible algorithm: in special cases, it reduces to deterministic and randomized methods such as gradient descent, coordinate descent, parallel coordinate descent and distributed coordinate descent -- in both nonaccelerated and accelerated variants. The variants with arbitrary (or importance) sampling are new. We provide a complexity analysis of ALPHA, from which we deduce as a direct corollary complexity bounds for its many variants, all matching or improving on the best known bounds.
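To illustrate the core idea, here is a minimal sketch (not the ALPHA method itself, which also covers proximal and accelerated variants) of randomized coordinate descent with a simple "independent" arbitrary sampling: each coordinate is included in the updated subset with its own probability. The least-squares objective, the function name, and the ESO-style safe step sizes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def cd_arbitrary_sampling(A, b, probs, step_sizes, iters=2000, seed=0):
    """Sketch: randomized coordinate descent for f(x) = 0.5*||Ax - b||^2.

    Each iteration independently includes coordinate i in the updated
    subset S_k with probability probs[i] (one simple instance of an
    'arbitrary sampling'; the paper allows far more general samplings)."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        g = A.T @ (A @ x - b)         # exact gradient; only sampled coords are applied
        mask = rng.random(n) < probs  # draw the random subset S_k
        x[mask] -= step_sizes[mask] * g[mask]
    return x

# Usage sketch: inclusion probability 1/2 per coordinate, with conservative
# steps 1/(tau * L_i), where L_i = ||A[:, i]||^2 is the coordinate-wise
# smoothness constant and tau ~ expected number of updated coordinates
# (a safe choice for dense problems, in the spirit of ESO step sizes).
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 8))
b = rng.standard_normal(40)
L = (A**2).sum(axis=0)
tau = 0.5 * 8
x = cd_arbitrary_sampling(A, b, probs=np.full(8, 0.5), step_sizes=1.0 / (tau * L))
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
```

Because the coordinate gradients are exact, the iterates converge to the exact minimizer; importance sampling corresponds to choosing nonuniform `probs`, e.g. proportional to the constants `L`.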
Recommendations
- Accelerated, parallel, and proximal coordinate descent
- A flexible coordinate descent method
- Coordinate descent with arbitrary sampling. II: Expected separable overapproximation.
- On the complexity analysis of randomized block-coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
Cites work
- A proximal stochastic gradient method with progressive variance reduction
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Accelerated, parallel, and proximal coordinate descent
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Asynchronous stochastic coordinate descent: parallelism and convergence properties
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization
- Coordinate descent algorithms
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Incremental majorization-minimization optimization with application to large-scale machine learning
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Introductory lectures on convex optimization. A basic course.
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- On optimal probabilities in stochastic coordinate descent methods
- Parallel coordinate descent methods for big data optimization
- Pegasos: primal estimated sub-gradient solver for SVM
- Robust Stochastic Approximation Approach to Stochastic Programming
- Separable approximations and decomposition methods for the augmented Lagrangian
Cited in (27)
- An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Inexact coordinate descent: complexity and preconditioning
- scientific article; zbMATH DE number 6982318 (no title available)
- Accelerating block coordinate descent methods with identification strategies
- Coordinate-friendly structures, algorithms and applications
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- Fastest rates for stochastic mirror descent methods
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- scientific article; zbMATH DE number 7370629 (no title available)
- Semi-stochastic coordinate descent
- Stochastic reformulations of linear systems: algorithms and convergence theory
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- Efficiency of the accelerated coordinate descent method on structured optimization problems
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- A discrete dynamics approach to sparse calculation and applied in ontology science
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Local linear convergence of proximal coordinate descent algorithm
- Proximal gradient methods with adaptive subspace sampling
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Optimization in high dimensions via accelerated, parallel, and proximal coordinate descent
- Convergence analysis of inexact randomized iterative methods
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
- Coordinate descent with arbitrary sampling. II: Expected separable overapproximation.
- Randomized iterative methods for linear systems
- A randomized coordinate descent method with volume sampling
- On the complexity of parallel coordinate descent