Parallel coordinate descent methods for big data optimization
DOI: 10.1007/s10107-015-0901-6 · zbMATH: 1342.90102 · arXiv: 1212.0873 · OpenAlex: W2032395696 · Wikidata: Q59476352 · MaRDI QID: Q263212
Publication date: 4 April 2016
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1212.0873
Keywords: convex optimization; lasso; iteration complexity; big data optimization; composite objective; expected separable over-approximation; huge-scale optimization; parallel coordinate descent; partial separability
MSC: Analysis of algorithms (68W40); Numerical mathematical programming methods (65K05); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Parallel algorithms in computer science (68W10); Decomposition methods (49M27); Randomized algorithms (68W20); Numerical methods of relaxation type (49M20)
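The keywords above describe the problem class this work addresses: minimizing a composite objective whose smooth part is partially separable and whose regularizer is block separable. As a rough sketch of that general setting (the notation below is assumed for illustration, not quoted from the record):

\[
\min_{x \in \mathbb{R}^N} \; F(x) := f(x) + \Omega(x),
\]

where \(f\) is convex, smooth, and partially separable (a sum of terms each depending on only a few coordinate blocks), and \(\Omega\) is convex and block separable, e.g. \(\Omega(x) = \lambda \|x\|_1\) in the lasso case. Parallel coordinate descent methods of this type update a randomly chosen subset of blocks simultaneously in each iteration, with step sizes controlled through an expected separable over-approximation of \(f\).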
Cites Work
- Inexact coordinate descent: complexity and preconditioning
- On optimal probabilities in stochastic coordinate descent methods
- Gradient methods for minimizing composite functions
- Subgradient methods for huge-scale optimization problems
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- A randomized Kaczmarz algorithm with exponential convergence
- Introductory lectures on convex optimization. A basic course.
- Coordinate descent optimization for \(l^{1}\) minimization with application to compressed sensing; a greedy algorithm
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Distributed Coordinate Descent Method for Learning with Big Data
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Accelerated Dual Descent for Network Flow Optimization
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Iterative Methods by Space Decomposition and Subspace Correction
- On Convergence of an Augmented Lagrangian Decomposition Method for Sparse Convex Optimization
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods