Inexact coordinate descent: complexity and preconditioning
DOI: 10.1007/s10957-016-0867-4 · zbMATH: 1350.65062 · arXiv: 1304.5530 · OpenAlex: W1673797905 · Wikidata: Q59462809 · MaRDI QID: Q306308
Rachael Tappenden, Peter Richtárik, Jacek Gondzio
Publication date: 31 August 2016
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1304.5530
Keywords: convex optimization; numerical experiments; preconditioning; conjugate gradients; block coordinate descent; iteration complexity; inexact methods
MSC classification: Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Numerical mathematical programming methods (65K05); Convex programming (90C25); Iterative numerical methods for linear systems (65F10); Complexity and performance of numerical algorithms (65Y20); Preconditioners for iterative methods (65F08)
Cites Work
- Parallel coordinate descent methods for big data optimization
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- A box constrained gradient projection algorithm for compressed sensing
- Quadratic regularizations in an interior-point method for primal block-angular problems
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- A coordinate gradient descent method for nonsmooth separable minimization
- A randomized Kaczmarz algorithm with exponential convergence
- Parallel interior-point solver for structured linear programs
- Introductory lectures on convex optimization. A basic course.
- The self regulation problem as an inexact steepest descent method for multicriteria optimization
- On the convergence of inexact block coordinate descent methods for constrained optimization
- Efficient block-coordinate descent algorithms for the group Lasso
- Parallel stochastic gradient algorithms for large-scale matrix completion
- Paved with good intentions: analysis of a randomized block Kaczmarz method
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Exact matrix completion via convex optimization
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Standardization and the Group Lasso Penalty
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Rate Analysis of Inexact Dual First-Order Methods: Application to Dual Decomposition
- The University of Florida sparse matrix collection
- Inexact block coordinate descent methods with application to non-negative matrix factorization
- On a Class of Limited Memory Preconditioners for Large Scale Linear Systems with Multiple Right-Hand Sides
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Numerical solution of saddle point problems
- An Interior Point Method for Block Angular Optimization
- Inexact Preconditioned Conjugate Gradient Method with Inner-Outer Iteration
- Sparse Reconstruction by Separable Approximation
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Methods of conjugate gradients for solving linear systems
- Compressed sensing
- Convergence of a block coordinate descent method for nondifferentiable minimization