Coordinate descent with arbitrary sampling I: algorithms and complexity
Publication: 2829565
DOI: 10.1080/10556788.2016.1190360
zbMath: 1365.90205
arXiv: 1412.8060
OpenAlex: W2963434703
Wikidata: Q56813559 (Scholia: Q56813559)
MaRDI QID: Q2829565
Publication date: 8 November 2016
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1412.8060
Classification (MSC):
- Analysis of algorithms and problem complexity (68Q25)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Derivative-free methods and methods using generalized derivatives (90C56)
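The paper concerns randomized coordinate descent in which the coordinates updated at each iteration are drawn from an arbitrary (possibly nonuniform) sampling. As a rough illustration of that idea only — not the algorithm analyzed in the paper — the sketch below runs serial coordinate descent on a least-squares objective, sampling each coordinate with probability proportional to its coordinate-wise Lipschitz constant. The function name, probabilities, and step sizes are assumptions made for this sketch.

```python
import numpy as np

def coordinate_descent(A, b, iters=10_000, seed=0):
    """Illustrative sketch: coordinate descent with a nonuniform coordinate
    sampling for min_x 0.5 * ||A x - b||^2. The choices p_i ~ L_i and step
    1/L_i are standard for this quadratic, assumed here for illustration."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    L = np.sum(A**2, axis=0)      # coordinate-wise Lipschitz constants L_i = ||A[:, i]||^2
    p = L / L.sum()               # sample coordinate i with probability proportional to L_i
    x = np.zeros(n)
    r = A @ x - b                 # maintained residual A x - b
    for _ in range(iters):
        i = rng.choice(n, p=p)    # draw one coordinate from the sampling
        g_i = A[:, i] @ r         # partial derivative of the objective along coordinate i
        step = g_i / L[i]
        x[i] -= step              # x_i <- x_i - (1 / L_i) * grad_i f(x)
        r -= step * A[:, i]       # update the residual incrementally
    return x
```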
Related Items
- An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- Randomized Iterative Methods for Linear Systems
- Accelerating block coordinate descent methods with identification strategies
- Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory
- Local linear convergence of proximal coordinate descent algorithm
- Faster convergence of a randomized coordinate descent method for linearly constrained optimization problems
- On the complexity of parallel coordinate descent
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- A discrete dynamics approach to sparse calculation and applied in ontology science
- Fastest rates for stochastic mirror descent methods
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
- Convergence Analysis of Inexact Randomized Iterative Methods
- Proximal Gradient Methods with Adaptive Subspace Sampling
Cites Work
- Parallel coordinate descent methods for big data optimization
- On optimal probabilities in stochastic coordinate descent methods
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Pegasos: primal estimated sub-gradient solver for SVM
- Introductory lectures on convex optimization. A basic course.
- Coordinate descent algorithms
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Separable approximations and decomposition methods for the augmented Lagrangian
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Accelerated, Parallel, and Proximal Coordinate Descent
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Interior Gradient and Proximal Methods for Convex and Conic Optimization