Inexact coordinate descent: complexity and preconditioning
DOI: 10.1007/S10957-016-0867-4 · zbMATH Open: 1350.65062 · DBLP: journals/jota/TappendenRG16 · arXiv: 1304.5530 · OpenAlex: W1673797905 · Wikidata: Q59462809 · Scholia: Q59462809 · MaRDI QID: Q306308 · FDO: Q306308
Authors: Rachael Tappenden, Peter Richtárik, Jacek Gondzio
Publication date: 31 August 2016
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1304.5530
Recommendations
- Efficiency of coordinate descent methods on huge-scale optimization problems
- A flexible coordinate descent method
- On the complexity analysis of randomized block-coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Coordinate descent with arbitrary sampling. I: Algorithms and complexity.
Keywords: convex optimization; preconditioning; numerical experiments; iteration complexity; block coordinate descent; conjugate gradients; inexact methods
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Complexity and performance of numerical algorithms (65Y20); Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Preconditioners for iterative methods (65F08); Iterative numerical methods for linear systems (65F10)
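The keywords above name the core technique: randomized block coordinate descent in which each block subproblem is solved only inexactly, for example by a few preconditioned conjugate gradient (PCG) steps. The sketch below illustrates that idea on a convex quadratic; the block size, inner tolerance, Jacobi preconditioner, and test problem are assumptions chosen for the example, not details taken from the paper.

```python
# Minimal illustrative sketch (not the authors' algorithm): randomized block
# coordinate descent on f(x) = 0.5*x'Ax - b'x, where each block system
# A_bb d = r_b is solved only approximately by a few PCG steps with a
# diagonal (Jacobi) preconditioner. All parameter choices are illustrative.
import numpy as np

def pcg(A, b, M_inv, tol=1e-2, max_iter=5):
    """A few steps of preconditioned CG for A x = b; returns an inexact solution."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r                      # apply the diagonal preconditioner
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break                      # inner solve is deliberately inexact
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

def inexact_bcd(A, b, block_size=10, n_iters=200, rng=None):
    rng = rng or np.random.default_rng(0)
    n = b.size
    x = np.zeros(n)
    blocks = [np.arange(i, min(i + block_size, n)) for i in range(0, n, block_size)]
    for _ in range(n_iters):
        blk = blocks[rng.integers(len(blocks))]   # sample a block uniformly
        r_blk = b[blk] - A[blk, :] @ x            # block of the residual (negative gradient)
        A_blk = A[np.ix_(blk, blk)]
        M_inv = 1.0 / np.diag(A_blk)              # Jacobi preconditioner for the block
        x[blk] += pcg(A_blk, r_blk, M_inv)        # inexact block update
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 100
    Q = rng.standard_normal((n, n))
    A = Q @ Q.T + n * np.eye(n)                   # SPD test matrix
    b = rng.standard_normal(n)
    x = inexact_bcd(A, b, rng=rng)
    print("residual:", np.linalg.norm(A @ x - b))
```

Tightening the inner PCG tolerance trades more inner iterations per block for fewer outer passes; the paper analyzes how such inexactness affects the overall iteration complexity.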
Cites Work
- The University of Florida sparse matrix collection
- Parallel interior-point solver for structured linear programs
- Convergence of a block coordinate descent method for nondifferentiable minimization
- A randomized Kaczmarz algorithm with exponential convergence
- Introductory lectures on convex optimization. A basic course.
- Gradient methods for minimizing composite functions
- Exact matrix completion via convex optimization
- Parallel stochastic gradient algorithms for large-scale matrix completion
- Methods of conjugate gradients for solving linear systems
- A coordinate gradient descent method for nonsmooth separable minimization
- Standardization and the group lasso penalty
- Compressed sensing
- Sparse Reconstruction by Separable Approximation
- Rate Analysis of Inexact Dual First-Order Methods: Application to Dual Decomposition
- Numerical solution of saddle point problems
- First-order methods of smooth convex optimization with inexact oracle
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Parallel coordinate descent methods for big data optimization
- Accelerated block-coordinate relaxation for regularized optimization
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Randomized methods for linear constraints: convergence rates and conditioning
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Inexact block coordinate descent methods with application to non-negative matrix factorization
- On a Class of Limited Memory Preconditioners for Large Scale Linear Systems with Multiple Right-Hand Sides
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- The self regulation problem as an inexact steepest descent method for multicriteria optimization
- On the convergence of inexact block coordinate descent methods for constrained optimization
- Efficient block-coordinate descent algorithms for the group Lasso
- Paved with good intentions: analysis of a randomized block Kaczmarz method
- An Interior Point Method for Block Angular Optimization
- Inexact Preconditioned Conjugate Gradient Method with Inner-Outer Iteration
- A box constrained gradient projection algorithm for compressed sensing
- Quadratic regularizations in an interior-point method for primal block-angular problems
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
Cited In (25)
- Inexact variable metric stochastic block-coordinate descent for regularized optimization
- Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling
- Schwarz iterative methods: infinite space splittings
- Parallel coordinate descent methods for big data optimization
- Riemannian preconditioned coordinate descent for low multilinear rank approximation
- On optimal probabilities in stochastic coordinate descent methods
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Accelerating block coordinate descent methods with identification strategies
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- An inexact successive quadratic approximation method for L-1 regularized optimization
- A block coordinate variable metric forward-backward algorithm
- A flexible coordinate descent method
- Convergence Analysis of Inexact Randomized Iterative Methods
- On the complexity analysis of randomized block-coordinate descent methods
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Unifying framework for accelerated randomized methods in convex optimization
- Optimization in high dimensions via accelerated, parallel, and proximal coordinate descent
- Accelerated, Parallel, and Proximal Coordinate Descent
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- Title not available
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- On the complexity of parallel coordinate descent
- Computing locally injective mappings by advanced MIPS
- Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone