Coordinate descent with arbitrary sampling. I: Algorithms and complexity.
DOI: 10.1080/10556788.2016.1190360 · zbMATH Open: 1365.90205 · DBLP: journals/oms/QuR16 · arXiv: 1412.8060 · OpenAlex: W2963434703 · Wikidata: Q56813559 · MaRDI QID: Q2829565
Authors: Zheng Qu, Peter Richtárik
Publication date: 8 November 2016
Published in: Optimization Methods &amp; Software
Full work available at URL: https://arxiv.org/abs/1412.8060
Recommendations
- Accelerated, parallel, and proximal coordinate descent
- A flexible coordinate descent method
- Coordinate descent with arbitrary sampling. II: Expected separable overapproximation.
- On the complexity analysis of randomized block-coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
MSC classification:
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Analysis of algorithms and problem complexity (68Q25)
- Derivative-free methods and methods using generalized derivatives (90C56)
Cites Work
- Pegasos: primal estimated sub-gradient solver for SVM
- Introductory lectures on convex optimization. A basic course.
- Robust Stochastic Approximation Approach to Stochastic Programming
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization
- Incremental majorization-minimization optimization with application to large-scale machine learning
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Parallel coordinate descent methods for big data optimization
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- On optimal probabilities in stochastic coordinate descent methods
- Accelerated, parallel, and proximal coordinate descent
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Coordinate descent algorithms
- A proximal stochastic gradient method with progressive variance reduction
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Separable approximations and decomposition methods for the augmented Lagrangian
- Asynchronous stochastic coordinate descent: parallelism and convergence properties
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
Cited In (27)
- An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Inexact coordinate descent: complexity and preconditioning
- Accelerating block coordinate descent methods with identification strategies
- Coordinate-friendly structures, algorithms and applications
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Fastest rates for stochastic mirror descent methods
- Semi-stochastic coordinate descent
- Stochastic reformulations of linear systems: algorithms and convergence theory
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- Efficiency of the accelerated coordinate descent method on structured optimization problems
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- A discrete dynamics approach to sparse calculation and applied in ontology science
- Local linear convergence of proximal coordinate descent algorithm
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Proximal gradient methods with adaptive subspace sampling
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Optimization in high dimensions via accelerated, parallel, and proximal coordinate descent
- Convergence analysis of inexact randomized iterative methods
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
- Randomized iterative methods for linear systems
- Coordinate descent with arbitrary sampling. II: Expected separable overapproximation.
- A randomized coordinate descent method with volume sampling
- On the complexity of parallel coordinate descent